text1
stringlengths
4
124k
text2
stringlengths
3
149k
same
int64
0
1
This article presents a model of general-purpose computing on a semantic network substrate. The concepts presented are applicable to any semantic network representation. However, due to the standards and technological infrastructure devoted to the NORP Web effort, this article is presented from this point of view. In the proposed model of computing, the application programming interface, the run-time program, and the state of the computing virtual machine are all represented in ORG (ORG). The implementation of the concepts presented provides a practical computing paradigm that leverages the highly-distributed and standardized representational-layer of the Semantic Web.
We review ORG's paradox (or EVENT" problem), not only arguably the oldest and crucial problem for ORG (ORG), but also a conundrum of profound scientific, philosophical and cultural importance. By a simple analysis of observation selection effects, the correct resolution of ORG's paradox is certain to tell us something about the future of humanity. Already a DATE puzzle - and a DATE since the last major review paper in the field by PERSON paradox has generated many ingenious discussions and hypotheses. We analyze the often tacit methodological assumptions built into various answers to this puzzle and attempt a new classification of the numerous solutions proposed in an already huge literature on the subject. Finally, we consider the ramifications of various classes of hypotheses for the practical ORG projects. Somewhat paradoxically, it seems that the class of (neo)catastrophic hypotheses gives, on balance, the strongest justification for guarded optimism regarding our current and near-future ORG efforts.
0
We try to perform geometrization of psychology by representing mental states, <<ideas>>, by points of a metric space, <<mental space>>. Evolution of ideas is described by dynamical systems in metric mental space. We apply the mental space approach for modeling of flows of unconscious and conscious information in the human brain. In a series of models, Models 1-4, we consider cognitive systems with increasing complexity of psychological behavior determined by structure of flows of ideas. Since our models are in fact models of the AI-type, one immediately recognizes that they can be used for creation of AI-systems, which we call psycho-robots, exhibiting important elements of human psyche. Creation of such psycho-robots may be useful improvement of domestic robots. At the moment domestic robots are merely simple working devices (e.g. vacuum cleaners or lawn mowers) . However, in future one can expect demand in systems which be able not only perform simple work tasks, but would have elements of human self-developing psyche. Such AI-psyche could play an important role both in relations between psycho-robots and their owners as well as between psycho-robots. Since the presence of a huge numbers of psycho-complexes is an essential characteristic of human psychology, it would be interesting to model them in the AI-framework.
We compute the anomalous dimension of the ORDINAL and ORDINAL moments of the flavour non-singlet twist-2 ORG and transversity operators at CARDINAL loops in both the ORG and ORG' schemes. To assist with the extraction of estimates of matrix elements computed using lattice regularization, the finite parts of the PERSON's function where the operator is inserted in a quark CARDINAL-point function are also provided at CARDINAL loops in both schemes.
0
Any real interaction process produces many incompatible system versions, or realisations, giving rise to omnipresent dynamic randomness and universally defined complexity (arXiv:physics/9806002). Since ORG behaviour dynamically emerges as the lowest complexity level (arXiv:quant-ph/9902016), ORG interaction randomness can only be relatively strong, which reveals the causal origin of quantum indeterminacy (arXiv:quant-ph/9511037) and true ORG chaos (arXiv:quant-ph/9511035), but rigorously excludes the possibility of unitary quantum computation, even in an "ideal", noiseless system. Any real computation is an internally chaotic (multivalued) process of system complexity development occurring in different regimes. Unitary ORG machines, including their postulated "magic", cannot be realised as such because their dynamically single-valued scheme is incompatible with the irreducibly high dynamic randomness at ORG levels and should be replaced by explicitly chaotic, intrinsically creative machines already realised in living organisms and providing their quite different, realistic kind of magic. The related concepts of reality-based, complex-dynamical nanotechnology, biotechnology and intelligence are outlined, together with the ensuing change in research strategy. The unreduced, dynamically multivalued solution to the many-body problem reveals the true, complex-dynamical basis of solid-state dynamics, including the origin and internal dynamics of macroscopic quantum states. The critical, "end-of-science" state of unitary knowledge and the way to positive change are causally specified within the same, universal concept of complexity.
A quite general interaction process of a multi-component system is analysed by the extended effective potential method liberated from usual limitations of perturbation theory or integrable model. The obtained causally complete solution of the many-body problem reveals the phenomenon of dynamic multivaluedness, or redundance, of emerging, incompatible system realisations and dynamic entanglement of system components within each realisation. The ensuing concept of dynamic complexity (and related intrinsic chaoticity) is absolutely universal and can be applied to the problem of (natural and artificial) intelligence and consciousness that dynamically emerge now as a high enough, properly specified levels of unreduced complexity of a suitable interaction process. Emergent consciousness can be identified with the appearance of bound, permanently localised states in the multivalued brain dynamics from strongly chaotic states of unconscious intelligence, by analogy with classical behaviour emergence from quantum states at the lowest levels of complex world dynamics. We show that the main properties of this dynamically emerging consciousness (and intelligence, at the preceding complexity level) correspond to empirically derived properties of natural consciousness and obtain causally substantiated conclusions about their artificial realisation, including the fundamentally justified paradigm of genuine machine consciousness. This rigorously defined machine consciousness is different from both natural consciousness and any mechanistic, dynamically single-valued imitation of the latter. We use then the same, truly universal concept of complexity to derive equally rigorous conclusions about mental and social implications of this complex-dynamic consciousness concept, demonstrating its critical importance for further progress of science and civilisation.
1
The purpose of this paper is to obtain exact solutions of the GPE field equations describing traversable wormholes supported by phantom energy. Their relationship to exact solutions in the literature is also discussed, as well as the conditions required to determine such solutions.
We hereby consider the problem of detectability of macro-engineering projects over interstellar distances, in the context of ORG (SETI). PERSON and his imaginative precursors, like PERSON, PERSON or PERSON, suggested macro-engineering projects as focal points in the context of extrapolations about the future of humanity and, by analogy, other intelligent species in the LOC. We emphasize that the search for signposts of extraterrestrial macro-engineering projects is not an optional pursuit within the family of ongoing and planned ORG projects; LOC, the failure of the orthodox ORG thus far clearly indicates this. Instead, this approach (for which we suggest a name of "Dysonian") should be the front-line and mainstay of any cogent ORG strategy in future, being significantly more promising than searches for directed, intentional radio or microwave emissions. This is in accord with our improved astrophysical understanding of the structure and evolution of the LOC, as well as with the recent wake-up call of PERSON to investigate consequences of postbiological evolution for astrobiology in general and ORG programs in particular. The benefits this multidisciplinary approach may bear for astrobiologists, evolutionary theorists and macro-engineers are also briefly highlighted.
0
We study the use of "sign $\alpha$-stable random projections" (where $MONEY 2$) for building basic data processing tools in the context of large-scale machine learning applications (e.g., classification, regression, clustering, and near-neighbor search). After the processing by sign stable random projections, the inner products of the processed data approximate various types of nonlinear kernels depending on the value of MONEY, this approach provides an effective strategy for approximating nonlinear learning algorithms essentially at the cost of ORG learning. When $\alpha =MONEY, it is known that the corresponding nonlinear kernel is the arc-cosine kernel. When MONEY, the procedure approximates the arc-cos-$\chi^2$ kernel (under certain condition). When $MONEY, it corresponds to the resemblance kernel. From practitioners' perspective, the method of sign $\alpha$-stable random projections is ready to be tested for large-scale learning applications, where $PERSON can be simply viewed as a tuning parameter. What is missing in the literature is an extensive empirical study to show the effectiveness of sign stable random projections, especially for MONEY 2$ or CARDINAL. The paper supplies such a study on a wide variety of classification datasets. In particular, we compare shoulder-by-shoulder sign stable random projections with the recently proposed "0-bit consistent weighted sampling (ORG)" (PERSON DATE).
Based on $\alpha$-stable random projections with small $PERSON, we develop a simple algorithm for compressed sensing (sparse signal recovery) by utilizing only the signs (i.e., CARDINAL-bit) of the measurements. Using only 1-bit information of the measurements results in substantial cost reduction in collection, storage, communication, and decoding for compressed sensing. The proposed algorithm is efficient in that the decoding procedure requires CARDINAL scan of the coordinates. Our analysis can precisely show that, for a CARDINALPERSON signal of length $MONEY, MONEY measurements (where $\delta$ is the confidence) would be sufficient for recovering the support and the signs of the signal. While the method is very robust against typical measurement noises, we also provide the analysis of the scheme under random flipping of the signs of the measurements. \noindent Compared to the well-known work on 1-bit marginal regression (which can also be viewed as a CARDINAL-scan method), the proposed algorithm requires orders of magnitude fewer measurements. Compared to QUANTITY FAC (ORG) (which is not a CARDINAL-scan algorithm), our method is still significantly more accurate. Furthermore, the proposed method is reasonably robust against random sign flipping while ORG is known to be very sensitive to this type of noise.
1
Exploring further the properties of ITRM-recognizable reals, we provide a detailed analysis of recognizable reals and their distribution in PERSON constructible universe L. In particular, we show that, for unresetting infinite time register machines, the recognizable reals coincide with the computable reals and that, for ITRMs, unrecognizables are generated at every index bigger than the ORDINAL limit of admissibles. We show that a real r is recognizable iff it is $\Sigma_{1}$-definable over $PERSON,r}}$, that $r\in ORG,r}}$ for every recognizable real $r$ and that either all or no real generated over an index stage $ORG are recognizable.
We define an ordinalized version of PERSON's realizability interpretation of intuitionistic logic by replacing Turing machines with PERSON's ordinal Turing machines (OTMs), thus obtaining a notion of realizability applying to arbitrary statements in the language of set theory. We observe that every instance of the axioms of intuitionistic ORDINAL-order logic are ORG-realizable and consider the question which axioms of ORG (ORG) and ORG's ORG (CZF) are ORG-realizable. This is an introductory note, and proofs are mostly only sketched or omitted altogether. It will soon be replaced by a more elaborate version.
1
In this paper CARDINAL presents new similarity, cardinality and entropy measures for bipolar fuzzy set and for its particular forms like intuitionistic, paraconsistent and fuzzy set. All these are constructed in the framework of multi-valued representations and are based on a penta-valued logic that uses the following logical values: true, false, unknown, contradictory and ambiguous. Also a new distance for bounded real interval was defined.
The Cauchy problem for the ORG equations in GPE gauge in $n$ space dimensions (MONEY) is locally well-posed for low regularity data, in CARDINAL and CARDINAL space dimensions even for data without finite energy. The result relies on the null structure for the main bilinear terms which was shown to be not only present in GPE gauge but also in GPE gauge by PERSON and LOC, who proved global well-posedness for finite energy data in CARDINAL space dimensions. This null structure is combined with product estimates for wave-Sobolev spaces given systematically by GPE, GPE and GPE.
0
In this paper, we prove that some NORP structural equation models with dependent errors having equal variances are identifiable from their corresponding NORP distributions. Specifically, we prove identifiability for the NORP structural equation models that can be represented as ORG chain graphs (Andersson et al., DATE). These chain graphs were originally developed to represent independence models. However, they are also suitable for representing causal models with additive noise (Pe\~na, DATE. Our result implies then that these causal models can be identified from observational data alone. Our result generalizes the result by PERSON and B\"uhlmann (DATE), who considered independent errors having equal variances. The suitability of the equal error variances assumption should be assessed on a per domain basis.
An interesting consequence of the modern cosmological paradigm is the spatial infinity of the universe. When coupled with naturalistic understanding of the origin of life and intelligence, which follows the basic tenets of astrobiology, and with some fairly incontroversial assumptions in the theory of observation selection effects, this infinity leads, as PERSON has recently shown, to a paradoxical conclusion. Olum's paradox is related, to the famous GPE's paradox in astrobiology and ORG studies. We, hereby, present an evolutionary argument countering the apparent inconsistency, and show how, in the framework of a simplified model, deeper picture of the coupling between histories of intelligent/technological civilizations and astrophysical evolution of the PRODUCT, can be achieved. This strategy has consequences of importance for both astrobiological studies and philosophy.
0
We present a multidimensional optimization problem that is formulated and solved in the tropical mathematics setting. The problem consists of minimizing a nonlinear objective function defined on vectors over an idempotent semifield by means of a conjugate transposition operator, subject to constraints in the form of ORG vector inequalities. A complete direct solution to the problem under fairly general assumptions is given in a compact vector form suitable for both further analysis and practical implementation. We apply the result to solve a multidimensional minimax single facility location problem with ORG distance and with inequality constraints imposed on the feasible location area.
A knowledge base is redundant if it contains parts that can be inferred from the rest of it. We study the problem of checking whether a ORG formula (a set of clauses) is redundant, that is, it contains clauses that can be derived from the other ones. Any CNF formula can be made irredundant by deleting some of its clauses: what results is an irredundant equivalent subset (I.E.S.) We study the complexity of some related problems: verification, checking existence of a I.E.S. with a given size, checking necessary and possible presence of clauses in ORG's, and uniqueness. We also consider the problem of redundancy with different definitions of equivalence.
0
While statistics focusses on hypothesis testing and on estimating (properties of) the true sampling distribution, in machine learning the performance of learning algorithms on future data is the primary issue. In this paper we bridge the gap with a general principle (FAC) that identifies hypotheses with best predictive performance. This includes predictive point and interval estimation, simple and composite hypothesis testing, (mixture) model selection, and others as special cases. For concrete instantiations we will recover well-known methods, variations thereof, and new ones. PHI nicely justifies, reconciles, and blends (a reparametrization invariant variation of) ORG, ORG, ORG, and moment estimation. CARDINAL particular feature of ORG is that it can genuinely deal with nested hypotheses.
We introduce a new principle for model selection in regression and classification. Many regression models are controlled by some smoothness or flexibility or complexity parameter c, e.g. the number of neighbors to be averaged over in k nearest neighbor (kNN) regression or the polynomial degree in regression with polynomials. Let ORG be the (best) regressor of complexity c on data NORP A more flexible regressor can fit more data D' well than a more rigid one. If something (here small loss) is easy to achieve it's typically worth less. We define the loss rank of ORG as the number of other (fictitious) data D' that are fitted better by f_D'^c than D is fitted by ORG. We suggest selecting the model complexity c that has minimal loss rank (LoRP). Unlike most penalized maximum likelihood variants (ORG,ORG,ORG), PRODUCT only depends on the regression function and loss function. It works without a stochastic noise model, and is directly applicable to any non-parametric regressor, like kNN. In this paper we formalize, discuss, and motivate PERSON, study it for specific regression problems, in particular linear ones, and compare it to other model selection schemes.
1
We consider a distributed source coding problem of $MONEY correlated NORP observations $Y_i, i=1,2,...,L$. We assume that the random vector $PERSON t} (Y_1,Y_2,$ $...,PERSON is an observation of the NORP random vector $PERSON...,X_K)$, having the form $Y^L=AX^K+N^L ,$ where $MONEY is a $L\times K$ matrix and $PERSON t}(N_1,N_2,...,N_L)$ is a vector of $MONEY independent PERSON random variables also independent of $PERSON The estimation error on $PERSON is measured by the distortion covariance matrix. The rate distortion region is defined by a set of all rate vectors for which the estimation error is upper bounded by an arbitrary prescribed covariance matrix in the meaning of positive semi definite. In this paper we derive explicit outer and inner bounds of the rate distortion region. This result provides a useful tool to study the direct and indirect source coding problems on this NORP distributed source coding system, which remain open in general.
Traditional image processing is a field of science and technology developed to facilitate human-centered image management. But DATE, when huge volumes of visual data inundate our surroundings (due to the explosive growth of image-capturing devices, proliferation of Internet communication means and video sharing services over WORK_OF_ART), human-centered handling of Big-data flows is impossible anymore. Therefore, it has to be replaced with a machine (computer) supported counterpart. Of course, such an artificial counterpart must be equipped with some cognitive abilities, usually characteristic for a human being. Indeed, in DATE, a new computer design trend - ORG development - is become visible. Cognitive image processing definitely will be one of its main duties. It must be specially mentioned that this trend is a particular case of a much more general movement - the transition from a "computational data-processing paradigm" to a "cognitive information-processing paradigm", which affects DATE many fields of science, technology, and engineering. This transition is a blessed novelty, but its success is hampered by the lack of a clear delimitation between the notion of data and the notion of information. Elaborating the case of cognitive image processing, the paper intends to clarify these important research issues.
0
We define the notion of a well-clusterable data set combining the point of view of the objective of $k$-means clustering algorithm (minimising the centric spread of data elements) and common sense (clusters shall be separated by gaps). We identify conditions under which the optimum of $k$-means objective coincides with a clustering under which the data is separated by predefined gaps. We investigate CARDINAL cases: when the whole clusters are separated by some gap and when only the cores of the clusters meet some separation condition. We overcome a major obstacle in using clusterability criteria due to the fact that known approaches to clusterability checking had the disadvantage that they are related to the optimal clustering which is ORG hard to identify. Compared to other approaches to clusterability, the novelty consists in the possibility of an a posteriori (after running $k$-means) check if the data set is well-clusterable or not. As the $k$-means algorithm applied for this purpose has polynomial complexity so does therefore the appropriate check. Additionally, if $k$-means++ fails to identify a clustering that meets clusterability criteria, with high probability the data is not well-clusterable.
We prove in this paper that the expected value of the objective function of the $k$-means++ algorithm for samples converges to population expected value. As $k$-means++, for samples, provides with constant factor approximation for $k$-means objectives, such an approximation can be achieved for the population with increase of the sample size. This result is of potential practical relevance when one is considering using subsampling when clustering large data sets (large data bases).
1
Process modeling (PM) in software engineering involves a specific way of understanding the world. In this context, philosophical work is not merely intrinsically important; it can also stand up to some of the more established software engineering research metrics. The object-oriented methodology takes an object as the central concept of modeling. This paper follows from a series of papers that focus on the notion of thinging in the context of the analysis phase of software system modeling. We use an abstract machine named ORGORG) as the mechanism by which things reveal themselves. We introduce a more in-depth investigation of a grand ORG that Signifies the totality of entities in the modeled system. We also present new notions, such as maximum grip, which refers to the level of granularity of the significance where optimum visibility of the model s meaning is given. The outcomes of this research indicate a positive improvement in the field of PM that may lead to enhance understanding of the object-oriented approach. ORG also presents the possibility of developing a new method in GPE.
The notion of events has occupied a central role in modeling and has an influence in computer science and philosophy. Recent developments in diagrammatic modeling have made it possible to examine conceptual representation of events. This paper explores some aspects of the notion of events that are produced by applying a new diagrammatic methodology with a focus on the interaction of events with such concepts as time and space, objects. The proposed description applies to abstract machines where events form the dynamic phases of a system. The results of this nontechnical research can be utilized in many fields where the notion of an event is typically used in interdisciplinary application.
1
We propose a long-term memory design for artificial general intelligence based on PERSON's incremental machine learning methods. We use R5RS Scheme and its standard library with a few omissions as the reference machine. We introduce a PERSON variant based on ORG together with CARDINAL synergistic update algorithms that use the same grammar as a guiding probability distribution of programs. The update algorithms include adjusting production probabilities, re-using previous solutions, learning programming idioms and discovery of frequent subprograms. Experiments with CARDINAL training sequences demonstrate that our approach to incremental learning is effective.
We propose that PERSON induction is complete in the physical sense via several strong physical arguments. We also argue that PERSON induction is fully applicable to quantum mechanics. We show how to choose an objective reference machine for universal induction by defining a physical message complexity and physical message probability, and argue that this choice dissolves some well-known objections to universal induction. We also introduce many more variants of physical message complexity based on energy and action, and discuss the ramifications of our proposals.
1
We consider a system model of a general finite-state machine (ratchet) that simultaneously interacts with CARDINAL kinds of reservoirs: a heat reservoir, a work reservoir, and an information reservoir, the latter being taken to be a running digital tape whose symbols interact sequentially with the machine. As has been shown in earlier work, this finite-state machine can act as a demon (with memory), which creates a net flow of energy from the heat reservoir into the work reservoir (thus extracting useful work) at the price of increasing the entropy of the information reservoir. Under very few assumptions, we propose a simple derivation of a family of inequalities that relate the work extraction with the entropy production. These inequalities can be seen as either upper bounds on the extractable work or as lower bounds on the entropy production, depending on the point of view. Many of these bounds are relatively easy to calculate and they are tight in the sense that equality can be approached arbitrarily closely. In their basic forms, these inequalities are applicable to any finite number of cycles (and not only asymptotically), and for a general input information sequence (possibly correlated), which is not necessarily assumed even stationary. Several known results are obtained as special cases.
We design games for truly concurrent bisimilarities, including strongly truly concurrent bisimilarities and branching truly concurrent bisimilarities, such as pomset bisimilarities, step bisimilarities, history-preserving bisimilarities and hereditary history-preserving bisimilarities.
0
The paper briefly describes a basic set of special combinatorial engineering frameworks for solving complex problems in the field of hierarchical modular systems. The frameworks consist of combinatorial problems (and corresponding models), which are interconnected/linked (e.g., by preference relation). Mainly, hierarchical morphological system model is used. The list of basic standard combinatorial engineering (technological) frameworks is the following: (CARDINAL) design of system hierarchical model, (CARDINAL) combinatorial synthesis ('bottom-up' process for system design), (CARDINAL) system evaluation, (CARDINAL) detection of system bottlenecks, (CARDINAL) system improvement (re-design, upgrade), (CARDINAL) multi-stage design (design of system trajectory), (CARDINAL) combinatorial modeling of system evolution/development and system forecasting. The combinatorial engineering frameworks are targeted to maintenance of some system life cycle stages. The list of main underlaying combinatorial optimization problems involves the following: knapsack problem, multiple-choice problem, assignment problem, spanning trees, morphological clique problem.
The paper described a generalized integrated glance to PERSON packing problems including a brief literature survey and some new problem formulations for the cases of multiset estimates of items. A new systemic viewpoint to PERSON packing problems is suggested: (a) basic element sets (item set, PERSON set, item subset assigned to bin), (b) binary relation over the sets: relation over item set as compatibility, precedence, dominance; relation over items and bins (i.e., correspondence of items to bins). A special attention is targeted to the following versions of PERSON packing problems: (a) problem with multiset estimates of items, (b) problem with colored items (and some close problems). Applied examples of bin packing problems are considered: (i) planning in paper industry (framework of combinatorial problems), (ii) selection of information messages, (iii) packing of messages/information packages in WiMAX communication system (brief description).
1
In this paper, we propose an extremely simple deep model for the unsupervised nonlinear dimensionality reduction -- deep distributed random samplings, which performs like a stack of unsupervised bootstrap aggregating. ORDINAL, its network structure is novel: each layer of the network is a group of mutually independent $k$-centers clusterings. ORDINAL, its learning method is extremely simple: the $PERSON centers of each clustering are MONEYPERSON randomly selected examples from the training data; for small-scale data sets, the $PERSON centers are further randomly reconstructed by a simple cyclic-shift operation. Experimental results on nonlinear dimensionality reduction show that the proposed method can learn abstract representations on both large-scale and small-scale problems, and meanwhile is much faster than deep neural networks on large-scale problems.
Voting is a simple mechanism to combine together the preferences of multiple agents. Agents may try to manipulate the result of voting by mis-reporting their preferences. CARDINAL barrier that might exist to such manipulation is computational complexity. In particular, it has been shown that it is ORG-hard to compute how to manipulate a number of different voting rules. However, ORG-hardness only bounds the worst-case complexity. Recent theoretical results suggest that manipulation may often be easy in practice. In this paper, we study empirically the manipulability of single transferable voting (ORG) to determine if computational complexity is really a barrier to manipulation. NORP was CARDINAL of the ORDINAL voting rules shown to be ORG-hard. It also appears CARDINAL of the harder voting rules to manipulate. We sample a number of distributions of votes including uniform and real world elections. In almost every election in our experiments, it was easy to compute how a single agent could manipulate the election or to prove that manipulation by a single agent was impossible.
0
Decision theory formally solves the problem of rational agents in uncertain worlds if the true environmental probability distribution is known. PERSON's theory of universal induction formally solves the problem of sequence prediction for unknown distribution. We unify both theories and give strong arguments that the resulting universal AIXI model behaves optimal in any computable environment. The major drawback of the AIXI model is that it is uncomputable. To overcome this problem, we construct a modified algorithm ORG, which is still superior to any other time t and space l bounded agent. The computation time of NORP is of the order t x CARDINAL.
PERSON sequence prediction is a scheme to predict digits of binary strings without knowing the underlying probability distribution. We call a prediction scheme informed when it knows the true probability distribution of the sequence. Several new relations between universal PERSON sequence prediction and informed prediction and general probabilistic prediction schemes will be proved. Among others, they show that the number of errors in PERSON prediction is finite for computable distributions, if finite in the informed case. Deterministic variants will also be studied. The most interesting result is that the deterministic variant of PERSON prediction is optimal compared to any other probabilistic or deterministic prediction scheme apart from additive square root corrections only. This makes it well suited even for difficult prediction problems, where it does not suffice when the number of errors is minimal to within some factor greater than one. PERSON's original bound and the ones presented here complement each other in a useful way.
1
In this article, we study the vector meson transitions among the charmonium and bottomonium states with the heavy quark effective theory in an systematic way, and make predictions for the ratios among the vector PERSON widths of a special multiplet to another multiplet. The predictions can be confronted with the experimental data in the future.
In this article, we introduce a P-wave between the diquark and antidiquark explicitly to construct the vector tetraquark currents, and study the vector tetraquark states with the ORG sum rules systematically, and obtain the lowest vector tetraquark masses up to now. The present predictions support assigning the $MONEY, $MONEY, $Y(4390)$ and $Z(4250)$ to be the vector tetraquark states with a relative P-wave between the diquark and antidiquark pair.
1
CARDINAL common type of symmetry is when values are symmetric. For example, if we are assigning colours (values) to nodes (variables) in a graph colouring problem then we can uniformly interchange the colours throughout a colouring. For a problem with value symmetries, all symmetric solutions can be eliminated in polynomial time. However, as we show here, both static and dynamic methods to deal with symmetry have computational limitations. With static methods, pruning all symmetric values is ORG-hard in general. With dynamic methods, we can take exponential time on problems which static methods solve without search.
Some contemporary views of the universe assume information and computation to be key in understanding and explaining the basic structure underpinning physical reality. We introduce the PERSON exploring some of the basic arguments giving foundation to these visions. We will focus on the algorithmic and ORG aspects, and how these may fit and support the computable universe hypothesis.
0
The increasing popularity of web-based applications has led to several critical services being provided over the Internet. This has made it imperative to monitor the network traffic so as to prevent malicious attackers from depleting the resources of the network and denying services to legitimate users. This paper has presented a mechanism for protecting a web-server against a distributed denial of service (DDoS) attack. Incoming traffic to the server is continuously monitored and any abnormal rise in the inbound traffic is immediately detected. The detection algorithm is based on a statistical analysis of the inbound traffic on the server and a robust hypothesis testing framework. While the detection process is on, the sessions from the legitimate sources are not disrupted and the load on the server is restored to the normal level by blocking the traffic from the attacking sources. To cater to different scenarios, the detection algorithm has various modules with varying level of computational and memory overheads for their execution. While the approximate modules are fast in detection and involve less overhead, they have lower detection accuracy. The accurate modules involve complex detection logic and hence involve more overhead for their execution, but they have very high detection accuracy. Simulations carried out on the proposed mechanism have produced results that demonstrate effectiveness of the scheme.
A universal inequality that bounds the angular momentum of a body by the square of its size is presented and heuristic physical arguments are given to support it. We prove a version of this inequality, as consequence of GPE equations, for the case of rotating axially symmetric, constant density, bodies. Finally, the physical relevance of this result is discussed.
0
Currently, organizations are transforming their business processes into e-services and service-oriented architectures to improve coordination across sales, marketing, and partner channels, to build flexible and scalable systems, and to reduce integration-related maintenance and development costs. However, this new paradigm is still fragile and lacks many features crucial for building sustainable and progressive computing infrastructures able to rapidly respond and adapt to the always-changing market and environmental business. This paper proposes a novel framework for building sustainable Ecosystem- Oriented Architectures (ORG) using e-service models. The backbone of this framework is an ecosystem layer comprising several computing units whose aim is to deliver universal interoperability, transparent communication, automated management, self-integration, self-adaptation, and security to all the interconnected services, components, and devices in the ecosystem. Overall, the proposed model seeks to deliver a comprehensive and a generic sustainable business IT model for developing agile e-enterprises that are constantly up to new business constraints, trends, and requirements. Future research can improve upon the proposed model so much so that it supports computational intelligence to help in decision making and problem solving.
Currently, cryptography is in wide use as it is being exploited in various domains from data confidentiality to data integrity and message authentication. Basically, cryptography shuffles data so that they become unreadable by unauthorized parties. However, clearly visible encrypted messages, no matter how unbreakable, will arouse suspicions. A better approach would be to hide the very existence of the message using steganography. Fundamentally, steganography conceals secret data into innocent-looking mediums called carriers which can then travel from the sender to the receiver safe and unnoticed. This paper proposes a novel steganography scheme for hiding digital data into uncompressed image files using a randomized algorithm and a context-free grammar. Besides, the proposed scheme uses CARDINAL mediums to deliver the secret data: a carrier image into which the secret data are hidden into random pixels, and a well-structured LANGUAGE text that encodes the location of the random carrier pixels. The LANGUAGE text is generated at runtime using a context-free grammar coupled with a lexicon of LANGUAGE words. The proposed scheme is stealthy, and hard to be noticed, detected, and recovered. Experiments conducted showed how the covering and the uncovering processes of the proposed scheme work. As future work, a semantic analyzer is to be developed so as to make the LANGUAGE text medium semantically correct, and consequently safer to be transmitted without drawing any attention.
1
Following a review of metric, ultrametric and generalized ultrametric, we review their application in data analysis. We show how they allow us to explore both geometry and topology of information, starting with measured data. Some themes are then developed based on the use of metric, ultrametric and generalized ultrametric in logic. In particular we study approximation chains in an ultrametric or generalized ultrametric context. Our aim in this work is to extend the scope of data analysis by facilitating reasoning based on the data analysis; and to show how quantitative and qualitative data analysis can be incorporated into logic programming.
Innovation is slowing greatly in the pharmaceutical sector. It is considered here how part of the problem is due to overly limiting intellectual property relations in the sector. On the other hand, computing and software in particular are characterized by great richness of intellectual property frameworks. Could the intellectual property ecosystem of computing come to the aid of the biosciences and life sciences? We look at how the answer might well be yes, by looking at (i) the extent to which a drug mirrors a software program, and (ii) what is to be gleaned from trends in research publishing in the life and biosciences.
1
In this paper we investigate the opportunities offered by the new LOC gravity models from the dedicated ORG and, especially, ORG missions to the project of measuring the general relativistic PERSON effect with a new LOC's artificial satellite. It turns out that it would be possible to abandon the stringent, and expensive, requirements on the orbital geometry of the originally prosed PERSON mission (same semimajor axis a=12270 km of the existing LAGEOS and inclination i=70 deg) by inserting the new spacecraft in a relatively low, and cheaper, orbit (a=7500-8000 km, i\sim 70 deg) and suitably combining its node PERSON with those of ORG and LAW in order to cancel out the ORDINAL even zonal harmonic coefficients of the multipolar expansion of the terrestrial gravitational potential J_2, J_4 along with their temporal variations. The total systematic error due to the mismodelling in the remaining even zonal harmonics would amount to \sim PERCENT and would be insensitive to departures of the inclination from the originally proposed value of many degrees. No semisecular long-period perturbations would be introduced because the period of the node, which is also the period of the solar PRODUCT tidal perturbation, would amount to \sim DATE. Since the coefficient of the node of the new satellite would be smaller than CARDINAL for such low altitudes, the impact of the non-gravitational perturbations of it on the proposed combination would be negligible. Then, a particular financial and technological effort for suitably building the satellite in order to minimize the non-conservative accelerations would be unnecessary.
We study a general relativistic gravitomagnetic CARDINAL-body effect induced by the spin angular momentum ${\boldsymbol S}_\textrm{X}$ of a rotating mass $PERSON orbited at distance $r_\textrm{X}$ by a local gravitationally bound restricted CARDINAL-body system $\mathcal{S}$ of size CARDINALr\ll r_\textrm{X}$ consisting of a test particle revolving around a massive body $M$. At the lowest post-Newtonian order, we analytically work out the doubly averaged rates of change of the NORP orbital elements of the test particle by finding non-vanishing long-term effects for the inclination $I$, the node CARDINALPERSON and the pericenter $\omega$. Such theoretical results are confirmed by a numerical integration of the equations of motion for a fictitious CARDINAL-body system. We numerically calculate the magnitudes of the NORP gravitomagnetic CARDINAL-body precessions for some astronomical scenarios in our solar system. For putative man-made orbiters of the natural moons NORP and LOC in the external fields of PRODUCT and LOC, the relativistic precessions due to the angular momenta of the gaseous giant planets can be MONEY\simeq CARDINAL~per~NORP~yr}^{-1}\right)$. A preliminary numerical simulation shows that, for certain orbital configurations of a hypothetical LOC orbiter, its range-rate signal $MONEY can become larger than the current PERSON accuracy of the existing spacecraft PRODUCT at LOC, i.e. $MONEY~s}^{-1}$, after QUANTITY The effects induced by the ORG's angular momentum on artificial probes of ORG and the LOC are at the level of $PERSONTIME~per~year}~\left(\mu\textrm{as~yr}^{-1}\right)$.
1
The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (ORG's), and nonstandard probability spaces (ORG's) is considered. If countable additivity is assumed, Popper spaces and a subclass of ORG's are equivalent; without the assumption of countable additivity, the equivalence no longer holds. If the state space is finite, ORG's are equivalent to ORG's. However, if the state space is infinite, ORG's are shown to be more general than ORG's.
Despite the several successes of deep learning systems, there are concerns about their limitations, discussed most recently by PERSON. This paper discusses PERSON's concerns and some others, together with solutions to several of these problems provided by the "P theory of intelligence" and its realisation in the "SP computer model". The main advantages of the NORP system are: relatively small requirements for data and the ability to learn from a single experience; the ability to model both hierarchical and non-hierarchical structures; strengths in several kinds of reasoning, including `commonsense' reasoning; transparency in the representation of knowledge, and the provision of an audit trail for all processing; the likelihood that the NORP system could not be fooled into bizarre or eccentric recognition of stimuli, as deep learning systems can be; the NORP system provides a robust solution to the problem of `catastrophic forgetting' in deep learning systems; the NORP system provides a theoretically-coherent solution to the problems of correcting over- and under-generalisations in learning, and learning correct structures despite errors in data; unlike most research on deep learning, the NORP programme of research draws extensively on research on human learning, perception, and cognition; and the NORP programme of research has an overarching theory, supported by evidence, something that is largely missing from research on deep learning. In general, the NORP system provides a much firmer foundation than deep learning for the development of artificial general intelligence.
0
We present a quantum-like (PERSON) model in that contexts (complexes of e.g. mental, social, biological, economic or even political conditions) are represented by complex probability amplitudes. This approach gives the possibility to apply the mathematical quantum formalism to probabilities induced in any domain of science. In our model ORG randomness appears not as irreducible randomness (as it is commonly accepted in conventional quantum mechanics, e.g., by PERSON and Dirac), but as a consequence of obtaining incomplete information about a system. We pay main attention to the PERSON description of processing of incomplete information. Our PERSON model can be useful in cognitive, social and political sciences as well as economics and artificial intelligence. In this paper we consider in a more detail CARDINAL special application -- PERSON modeling of brain's functioning. The brain is modeled as a PERSON-computer.
The paper describes a general glance to the use of element exchange techniques for optimization over permutations. A multi-level description of problems is proposed which is a fundamental to understand nature and complexity of optimization problems over permutations (e.g., ordering, scheduling, traveling salesman problem). The description is based on permutation neighborhoods of several kinds (e.g., by improvement of an objective function). Our proposed operational digraph and its kinds can be considered as a way to understand convexity and polynomial solvability for combinatorial optimization problems over permutations. Issues of an analysis of problems and a design of hierarchical heuristics are discussed. The discussion leads to a multi-level adaptive algorithm system which analyzes an individual problem and selects/designs a solving strategy (trajectory).
0
This article presents an overview of computability logic -- the game-semantically constructed logic of interactive computational tasks and resources. There is CARDINAL non-overview, technical section in it, devoted to a proof of the soundness of affine logic with respect to the semantics of computability logic. A comprehensive online source on the subject can be found at ORG
Computability logic (CL) is a systematic formal theory of computational tasks and resources, which, in a sense, can be seen as a semantics-based alternative to (the syntactically introduced) linear logic. With its expressive and flexible language, where formulas represent computational problems and "truth" is understood as algorithmic solvability, ORG potentially offers a comprehensive logical basis for constructive applied theories and computing systems inherently requiring constructive and computationally meaningful underlying logics. Among the best known constructivistic logics is ORG's intuitionistic calculus ORG, whose language can be seen as a special fragment of that of ORG. The constructivistic philosophy of ORG, however, has never really found an intuitively convincing and mathematically strict semantical justification. CL has good claims to provide such a justification and hence a materialization of ORG's known thesis "INT = logic of problems". The present paper contains a soundness proof for ORG with respect to the ORG semantics. A comprehensive online source on ORG is available at ORG
1
General purpose intelligent learning agents cycle through (complex,ORG) sequences of observations, actions, and rewards. On the other hand, reinforcement learning is well-developed for small finite state PERSON Processes (MDPs). So far it is an art performed by human designers to extract the right state representation out of the bare observations, i.e. to reduce the agent setup to the ORG framework. Before we can think of mechanizing this search for suitable MDPs, we need a formal objective criterion. The main contribution of this article is to develop such a criterion. I also integrate the various parts into CARDINAL learning algorithm. Extensions to more realistic dynamic NORP networks are developed in a companion article.
The impact of the latest combined ORG/GRACE/terrestrial measurements LOC gravity model ORG-CG03C on the measurement of the Lense-Thirring effect with some ORG combinations of the nodes of some of the existing LOC's artificial satellites is presented. The CARDINAL-sigma upper bound of the systematic error in the node-node LAGEOS-LAGEOS II combination is PERCENT (PERCENT with ORG-GRACE02S, \sim PERCENT with ORG-CG01C and \sim PERCENT with ORG), while it is DATE for the node-only LAGEOS-LAGEOS II-Ajisai-Jason-1 combination (PERCENT with ORG-GRACE02S, PERCENT with ORG-CG01C and PERCENT with ORG).
0
We explore a simple mathematical model of network computation, based on PERSON chains. Similar models apply to a broad range of computational phenomena, arising in networks of computers, as well as in genetic, and neural nets, in social networks, and so on. The main problem of interaction with such spontaneously evolving computational systems is that the data are not uniformly structured. An interesting approach is to try to extract the semantical content of the data from their distribution among the nodes. A concept is then identified by finding the community of nodes that share it. The task of data structuring is thus reduced to the task of finding the network communities, as groups of nodes that together perform some non-local data processing. Towards this goal, we extend the ranking methods from nodes to paths. This allows us to extract some information about the likely flow biases from the available static information about the network.
Dialectical logic is the logic of dialectical processes. The goal of dialectical logic is to reveal the dynamical notions inherent in logical computational systems. The fundamental notions of proposition and truth-value in standard logic are subsumed by the notions of process and flow in dialectical logic. Standard logic motivates the core sequential aspect of dialectical logic. Horn-clause logic requires types and nonsymmetry and also motivates the parallel aspect of dialectical logic. The process logics of PERSON and ORG reveal the internal/external aspects of dialectical logic. The sequential internal aspect of dialectical logic should be viewed as a typed or distributed version of GPE's linear logic with ORG tensor. The simplest version of dialectical logic is inherently intuitionistic. However, by following GPE's approach in standard logic using double negation closure, we can define a classical version of dialectical logic.
0
Complementary strands in DNA double helix show temporary fluctuational openings which are essential to biological functions such as transcription and replication of the genetic information. Such large amplitude fluctuations, known as the breathing of DNA, are generally localized and, microscopically, are due to the breaking of the hydrogen bonds linking the base pairs (\emph{bps}). I apply imaginary time path integral techniques to a mesoscopic NORP which accounts for the helicoidal geometry of a short circular DNA molecule. The \emph{bps} displacements with respect to the ground state are interpreted as time dependent paths whose amplitudes are consistent with the model potential for the hydrogen bonds. The portion of the paths configuration space contributing to the partition function is determined by selecting the ensemble of paths which fulfill the ORDINAL law of thermodynamics. Computations of the thermodynamics in the denaturation range show the energetic advantage for the equilibrium helicoidal geometry peculiar of B-DNA. I discuss the interplay between twisting of the double helix and anharmonic stacking along the molecule backbone suggesting an interesting relation between intrinsic nonlinear character of the microscopic interactions and molecular topology.
We develop ORG-logitboost, based on the prior work on ORG-boost and robust logitboost. Our extensive experiments on a variety of datasets demonstrate the considerable improvement of ORG-logitboost over logitboost and ORG.
0
More than a speculative technology, ORG computing seems to challenge our most basic intuitions about how the physical world should behave. In this thesis I show that, while some intuitions from classical computer science must be jettisoned in the light of modern physics, many others emerge nearly unscathed; and I use powerful tools from computational complexity theory to help determine which are which.
A celebrated DATE theorem of PERSON asserts that honest, rational NORP agents with common priors will never "agree to disagree": if their opinions about any topic are common knowledge, then those opinions must be equal. Economists have written numerous papers examining the assumptions behind this theorem. But CARDINAL key questions went unaddressed: ORDINAL, can the agents reach agreement after a conversation of reasonable length? ORDINAL, can the computations needed for that conversation be performed efficiently? This paper answers both questions in the affirmative, thereby strengthening PERSON's original conclusion. We ORDINAL show that, for CARDINAL agents with a common prior to agree within epsilon about the expectation of a [CARDINAL] variable with high probability over their prior, it suffices for them to exchange order CARDINAL/epsilon^2 bits. This bound is completely independent of the number of bits n of relevant knowledge that the agents have. We then extend the bound to CARDINAL or more agents; and we give an example where the economists' "standard protocol" (which consists of repeatedly announcing one's current expectation) nearly saturates the bound, while a new "attenuated protocol" does better. Finally, we give a protocol that would cause CARDINAL NORP to agree within epsilon after exchanging order CARDINAL/epsilon^2 messages, and that can be simulated by agents with limited computational resources. By this we mean that, after examining the agents' knowledge and a transcript of their conversation, no one would be able to distinguish the agents from perfect NORP. The time used by the simulation procedure is exponential in CARDINAL/epsilon^6 but not in n.
1
This paper studies sequence prediction based on the monotone NORP complexity NORP m, i.e. based on universal deterministic/CARDINAL-part ORG. m is extremely close to PERSON's prior M, the latter being an excellent predictor in deterministic as well as probabilistic environments, where performance is measured in terms of convergence of posteriors or losses. Despite this closeness to M, it is difficult to assess the prediction quality of m, since little is known about the closeness of their posteriors, which are the important quantities for prediction. We show that for deterministic computable environments, the "posterior" and losses of m converge, but rapid convergence could only be shown on-sequence; the off-sequence behavior is unclear. In probabilistic environments, neither the posterior nor the losses converge, in general.
I present a NORP model and a computational method suitable to evaluate structural and thermodynamic properties of helical molecules embedded in crowded environments which may confine the space available to the base pair fluctuations. It is shown that, for the specific case of a short DNA fragment in a nanochannel, the molecule is markedly over-twisted and stretched by narrowing the width of the channel.
0
In this article, we study the axial-vector tetraquark state and QUANTITY mixed state consist of light quarks using the ORG sum rules. The present predictions disfavor assigning the $MONEY as the axial-vector tetraquark state with $PERSON, while support assigning the $MONEY as the axial-vector MONEY mixed state.
In this article, we study the radiative transitions among the vector and scalar heavy quarkonium states with the covariant light-front quark model. In calculations, we observe that the radiative decay widths are sensitive to the constituent quark masses and the shape parameters of the wave-functions, and reproduce the experimental data with suitable parameters.
1
This is a chapter in a book \emph{Quantum Error Correction} edited by D. A. FAC and PERSON, and published by ORG (CARDINAL (http://www.cambridge.org/us/academic/subjects/physics/quantum-physics-quantum-information-and-quantum-computation/quantum-error-correction)\\ presenting the author's view on feasibility of fault-tolerant quantum information processing.
Recent papers by DATE and NORP have emphasized that wormholes supported by arbitrarily small amounts of exotic matter will have to be incredibly fine-tuned if they are to be traversable. This paper discusses a wormhole model that strikes a balance between CARDINAL conflicting requirements, reducing the amount of exotic matter and fine-tuning the metric coefficients, ultimately resulting in an engineering challenge: CARDINAL requirement can only be met at the expense of the other. The wormhole model is macroscopic and satisfies various traversability criteria.
0
This letter introduces a new, substantially simplified version of the branching recurrence operation of computability logic (see ORG), and proves its equivalence to the old, "canonical" version.
The earlier paper "Introduction to clarithmetic I" constructed an axiomatic system of arithmetic based on computability logic (see ORG), and proved its soundness and extensional completeness with respect to polynomial time computability. The present paper elaborates CARDINAL additional sound and complete systems in the same style and sense: CARDINAL for polynomial space computability, CARDINAL for elementary recursive time (and/or space) computability, and one for primitive recursive time (and/or space) computability.
1
PERSON, who does not have any sophisticated quantum technology, delegates her ORG computing to PERSON, who has a fully-fledged ORG computer. Can she check whether the computation PERSON performs for her is correct? She cannot recalculate the result by herself, since she does not have any quantum computer. A recent experiment with photonic qubits suggests she can. Here, I explain the basic idea of the result, and recent developments about secure cloud ORG computing.
PERSON is usually defined as a subfield of ORG, which is busy with information extraction from raw data sets. Despite of its common acceptance and widespread recognition, this definition is wrong and groundless. Meaningful information does not belong to the data that bear it. It belongs to the observers of the data and it is a shared agreement and a convention among them. Therefore, this private information cannot be extracted from the data by any means. Therefore, all further attempts of ORG apologists to justify their funny business are inappropriate.
0
Purpose: To compare CARDINAL major Web search engines (ORG, ORG, ORG, ORG, and ORG) for their retrieval effectiveness, taking into account not only the results but also the results descriptions. Design/Methodology/Approach: The study uses real-life queries. Results are made anonymous and are randomised. Results are judged by the persons posing the original queries. Findings: The CARDINAL major search engines, ORG and ORG, perform best, and there are no significant differences between them. ORG delivers significantly more relevant result descriptions than any other search engine. This could be CARDINAL reason for users perceiving this engine as superior. Research Limitations: The study is based on a user model where the user takes into account a certain amount of results rather systematically. This may not be the case in real life. Practical Implications: Implies that search engines should focus on relevant descriptions. Searchers are advised to use other search engines in addition to ORG. Originality/Value: This is the ORDINAL major study comparing results and descriptions systematically and proposes new retrieval measures to take into account results descriptions
The path to greater diversity, as we have seen, cannot be achieved by merely hoping for a new search engine nor will government support for a single alternative achieve this goal. What is instead required is to create the conditions that will make establishing such a search engine possible in the ORDINAL place. I describe how building and maintaining a proprietary index is the greatest deterrent to such an undertaking. We must ORDINAL overcome this obstacle. Doing so will still not solve the problem of the lack of diversity in the search engine marketplace. But it may establish the conditions necessary to achieve that desired end.
1
Nowadays folksonomy is used as a system derived from user-generated electronic tags or keywords that annotate and describe online content. But it is not a classification system as an ontology. To consider it as a classification system it would be necessary to share a representation of contexts by all the users. This paper is proposing the use of folksonomies and network theory to devise a new concept: a "WORK_OF_ART" to represent folksonomies. This paper proposed and analyzed the network structure of PERSON tags thought as folsksonomy tags suggestions for the user on a dataset built on chosen websites. It is observed that the PRODUCT has relative low path lengths checking it with classic networking measures (clustering coefficient). Experiment result shows it can facilitate serendipitous discovery of content among users. Neat examples and clear formulas can show how a "WORK_OF_ART" can be used to tackle ontology mapping challenges.
Information retrieval is not only the most frequent application executed on the Web but it is also the base of different types of applications. Considering collective intelligence of groups of individuals as a framework for evaluating and incorporating new experiences and information we often cannot retrieve such knowledge being tacit. ORG knowledge underlies many competitive capabilities and it is hard to articulate on discrete ontology structure. It is unstructured or unorganized, and therefore remains hidden. Developing generic solutions that can find the hidden knowledge is extremely complex. Moreover this will be a great challenge for the developers of semantic technologies. This work aims to explore ways to make explicit and available the tacit knowledge hidden in the collective intelligence of a collaborative environment within organizations. The environment was defined by folksonomies supported by a faceted semantic search. Vector space model which incorporates an analogy with the mathematical apparatus of quantum theory is adopted for the representation and manipulation of the meaning of folksonomy. Vector space retrieval has been proven efficiency when there isn't a data behavioural because it bears ranking algorithms involving a small number of types of elements and few operations. A solution to find what the user has in mind when posing a query could be based on "joint meaning" understood as a joint construal of the creator of the contents and the reader of the contents. The joint meaning was proposed to deal with vagueness on ontology of folksonomy indeterminacy, incompleteness and inconsistencies on collective intelligence. A proof-of concept prototype was built for collaborative environment as evolution of the actual social networks (like GPE, GPE,..) using the information visualization on a ORG application with ORG techniques and technologies.
1
A chiral field theory of $MONEY glueball is presented. The coupling between the quark operator and the $MONEY glueball field is revealed from the ORG) anomaly. The NORP of this theory is constructed by adding a $MONEY glueball field to a successful NORP of chiral field theory of pseudoscalar, vector, and axial-vector mesons. Quantitative study of the physical processes of the $MONEY glueball of $m=1.405\textrm{GeV}$ is presented. The theoretical predictions can be used to identify the $MONEY glueball.
Based on an effective chiral theory of pseudoscalar, vector, and axial-vector mesons, the coefficients of the chiral perturbation theory are predicted. There is no new parameter in these predictions.
1
The scope of this teaching package is to make a brief introduction to some notions and properties of chaotic systems. We ORDINAL make a brief introduction to chaos in general and then we show some important properties of chaotic systems using the logistic map and its bifurcation diagram. We also show the universality found in "the route to chaos". The user is only required to have notions of algebra, so it is quite accessible. The formal basis of chaos theory are not covered in this introduction, but are pointed out for the reader interested in them. Therefore, this package is also useful for people who are interested in going deep into the mathematical theories, because it is a simple introduction of the terminology, and because it points out which are the original sources of information (so there is no danger in falling in the trap of "WORK_OF_ART in TIME" or "Bifurcation Diagrams for Dummies"). The included exercises are suggested for consolidating the covered topics. The on-line resources are highly recommended for extending this brief induction.
This paper discusses the benefits of describing the world as information, especially in the study of the evolution of life and cognition. Traditional studies encounter problems because it is difficult to describe life and cognition in terms of matter and energy, since their laws are valid only at the physical scale. However, if matter and energy, as well as life and cognition, are described in terms of information, evolution can be described consistently as information becoming more complex. The paper presents CARDINAL tentative laws of information, valid at multiple scales, which are generalizations of NORP, cybernetic, thermodynamic, psychological, philosophical, and complexity principles. These are further used to discuss the notions of life, cognition and their evolution.
1
Cirquent calculus is a novel proof theory permitting component-sharing between logical expressions. Using it, the predecessor article "Elementary-base cirquent calculus I: Parallel and choice connectives" built the sound and complete axiomatization CL16 of a propositional fragment of computability logic (see http://www.csc.villanova.edu/~japaridz/CL/ ). The atoms of the language of CL16 represent elementary, i.e., moveless, games, and the logical vocabulary consists of negation, parallel connectives and choice connectives. The present paper constructs the ORDINAL-order version CL17 of ORG, also enjoying soundness and completeness. The language of CL17 augments that of CL18 by including choice quantifiers. Unlike classical predicate calculus, CL17 turns out to be decidable.
Clarithmetics are number theories based on computability logic (see http://www.csc.villanova.edu/~japaridz/CL/ ). Formulas of these theories represent interactive computational problems, and their "truth" is understood as existence of an algorithmic solution. Various complexity constraints on such solutions induce various versions of clarithmetic. The present paper introduces a parameterized/schematic version PRODUCT). By tuning the CARDINAL parameters P1,P2,P3 in an essentially mechanical manner, CARDINAL automatically obtains sound and complete theories with respect to a wide range of target tricomplexity classes, i.e. combinations of time (set by ORG), space (set by PERSON) and so called amplitude (set by CARDINAL) complexities. Sound in the sense that every theorem T of the system represents an interactive number-theoretic computational problem with a solution from the given tricomplexity class and, furthermore, such a solution can be automatically extracted from a proof of NORP And complete in the sense that every interactive number-theoretic problem with a solution from the given tricomplexity class is represented by some theorem of the system. Furthermore, through tuning the ORDINAL parameter CARDINAL, at the cost of sacrificing recursive axiomatizability but not simplicity or elegance, the above extensional completeness can be strengthened to intensional completeness, according to which every formula representing a problem with a solution from the given tricomplexity class is a theorem of the system. This article is published in CARDINAL parts. The previous Part I has introduced the system and proved its completeness, while the present Part II is devoted to proving soundness.
1
In the same sense as classical logic is a formal theory of truth, the recently initiated approach called computability logic is a formal theory of computability. It understands (interactive) computational problems as games played by a machine against the environment, their computability as existence of a machine that always wins the game, logical operators as operations on computational problems, and validity of a logical formula as being a scheme of "always computable" problems. The present contribution gives a detailed exposition of a soundness and completeness proof for an axiomatization of CARDINAL of the most basic fragments of computability logic. The logical vocabulary of this fragment contains operators for the so called parallel and choice operations, and its atoms represent elementary problems, i.e. predicates in the standard sense. This article is self-contained as it explains all relevant concepts. While not technically necessary, however, familiarity with the foundational paper "Introduction to computability logic" [WORK_OF_ART and ORG (DATE), CARDINAL] would greatly help the reader in understanding the philosophy, underlying motivations, potential and utility of computability logic, -- the context that determines the value of the present results. Online introduction to the subject is available at ORG and http://www.csc.villanova.edu/~japaridz/CL/gsoll.html .
We consider the state dependent channels with full state information with at the sender and partial state information at the receiver. For this state dependent channel, the channel capacity under rate constraint on the state information at the decoder was determined by PERSON. In this paper, we study the correct probability of decoding at rates above the capacity. We prove that when the transmission rate is above the capacity this probability goes to CARDINAL exponentially and derive an explicit lower bound of this exponent function.
0
We extend the algebra of reversible computation to support ORG computing. Since the algebra is based on true concurrency, it is reversible for quantum computing and it has a sound and complete theory.
We have unified quantum and classical computing in open ORG systems called NORP which is a quantum generalization of process algebra ORG. But, an axiomatization for quantum and classical processes with an assumption of closed ORG systems is still missing. For closed ORG, unitary operator, ORG measurement and ORG are CARDINAL basic components for ORG computing. This leads to probability unavoidable. Along the solution of NORP to unify quantum and classical computing in open ORG, we unify quantum and classical computing with an assumption of closed systems under the framework of ORG-like probabilistic process algebra. This unification make it can be used widely in verification for quantum and classical computing mixed systems, such as most quantum communication protocols.
1
A major challenge of interdisciplinary description of complex system behaviour is whether real systems of higher complexity levels can be understood with at least the same degree of objective, "scientific" rigour and universality as "simple" systems of classical, NORP science paradigm. The problem is reduced to that of arbitrary, many-body interaction (unsolved in standard theory). Here we review its causally complete solution, the ensuing concept of complexity and applications. The discovered key properties of dynamic multivaluedness and entanglement give rise to a qualitatively new kind of mathematical structure providing the exact version of real system behaviour. The extended mathematics of complexity contains the truly universal definition of dynamic complexity, randomness (chaoticity), classification of all possible dynamic regimes, and the unifying principle of any system dynamics and evolution, the universal symmetry of complexity. Every real system has a non-zero (and actually high) value of unreduced dynamic complexity determining, in particular, "mysterious" behaviour of ORG systems and relativistic effects causally explained now as unified manifestations of complex interaction dynamics. The observed differences between various systems are due to different regimes and levels of their unreduced dynamic complexity. We outline applications of universal concept of dynamic complexity emphasising cases of "truly complex" systems from higher complexity levels (ecological and living systems, brain operation, intelligence and consciousness, autonomic information and communication systems) and show that the urgently needed progress in social and intellectual structure of civilisation inevitably involves qualitative transition to unreduced complexity understanding (we call it "revolution of complexity").
This paper examines whether unitary evolution alone is sufficient to explain emergence of the classical world from the perspective of computability theory. Specifically, it looks at the problem of how the choice related to the measurement is made by the observer viewed as a quantum system. In interpretations where the system together with the observers is completely described by unitary transformations, the observer cannot make any choices and so measurement is impossible. From the perspective of computability theory, a ORG machine cannot halt and so it cannot observe the computed state, indicating that unitarity alone does not explain all matter processes. Further it is argued that the consideration of information and observation requires an overarching system of knowledge and expectations about outcomes.
0
We calculate the limiting behavior of relative NORP entropy when the ORDINAL probability distribution is close to the ORDINAL one in a non-regular location-shift family which is generated by a probability distribution whose support is an interval or a CARDINAL-line. This limit can be regarded as a generalization of ORG information, and plays an important role in large deviation theory.
We derive a new upper bound for PERSON's information in secret key generation from a common random number without communication. This bound improves on PERSON et al(1995)'s bound based on the R\'enyi entropy of order CARDINAL because the bound obtained here uses the R\'enyi entropy of order $MONEY for $s \in [0,1]$. This bound is applied to a wire-tap channel. Then, we derive an exponential upper bound for PERSON's information. Our exponent is compared with Hayashi(2006)'s exponent. For the additive case, the bound obtained here is better. The result is applied to secret key agreement by public discussion.
1
The theory of rational choice assumes that when people make decisions they do so in order to maximize their utility. In order to achieve this goal they ought to use all the information available and consider all the choices available to choose an optimal choice. This paper investigates what happens when decisions are made by artificially intelligent machines in the market rather than human beings. ORDINAL, the expectations of the future are more consistent if they are made by an artificially intelligent machine and the decisions are more rational and thus marketplace becomes more rational.
This paper proposes the response surface method for finite element model updating. The response surface method is implemented by approximating the finite element model surface response equation by a multi-layer perceptron. The updated parameters of the finite element model were calculated using genetic algorithm by optimizing the surface response equation. The proposed method was compared to the existing methods that use simulated annealing or genetic algorithm together with a full finite element model for finite element model updating. The proposed method was tested on an unsymmetri-cal H-shaped structure. It was observed that the proposed method gave the updated natural frequen-cies and mode shapes that were of the same order of accuracy as those given by simulated annealing and genetic algorithm. Furthermore, it was observed that the response surface method achieved these results at a computational speed that was CARDINAL times as fast as the genetic algorithm and a full finite element model and CARDINAL times faster than the simulated annealing.
1
The launching of NORP and ORG, and methodological developments in ORG have made many more indicators for evaluating journals available than the traditional ORG, Cited Half-life, and Immediacy Index of the ORG. In this study, these new indicators are compared with one another and with the older ones. Do the various indicators measure new dimensions of the citation networks, or are they highly correlated among them? Are they robust and relatively stable over time? CARDINAL main dimensions are distinguished -- size and impact -- which together shape influence. The H-index combines the CARDINAL dimensions and can also be considered as an indicator of reach (like NORP). ORG is mainly an indicator of size, but has important interactions with centrality measures. ORG (ORG) indicator provides an alternative to ORG, but the computation is less easy.
One can study communications by using FAC's (DATE) mathematical theory of communication. In social communications, however, the channels are not "fixed", but themselves subject to change. Communication systems change by communicating information to related communication systems; co-variation among systems if repeated over time, can lead to co-evolution. Conditions for stabilization of higher-order systems are specifiable: segmentation, stratification, differentiation, reflection, and self-organization can be distinguished in terms of developmental stages of increasingly complex networks. In addition to natural and cultural evolution, a condition for the artificial evolution of communication systems can be specified.
1
We explore multi-terminal quantum transport through a benzene molecule threaded by an LOC flux $\phi$. A simple tight-binding model is used to describe the system and all the calculations are done based on the PERSON's function formalism. With a brief description of CARDINAL-terminal quantum transport, we present a detailed study of CARDINAL-terminal transport properties through the benzene molecule to reveal the actual mechanism of electron transport. Here we numerically compute the multi-terminal conductances, reflection probabilities and current-voltage characteristics in the aspects of molecular coupling strength and magnetic flux $MONEY Most significantly we observe that, the molecular system where the benzene molecule is attached to CARDINAL terminals can be operated as a transistor, and we call it a molecular transistor. This aspect can be utilized in designing nano-electronic circuits and our investigation may provide a basic framework to study electron transport in any complicated multi-terminal quantum system.
Computability logic is a formal theory of computational tasks and resources. Its formulas represent interactive computational problems, logical operators stand for operations on computational problems, and validity of a formula is understood as being a scheme of problems that always have algorithmic solutions. A comprehensive online source on the subject is available at ORG . The earlier article "Propositional computability logic I" proved soundness and completeness for the (in a sense) minimal nontrivial fragment CL1 of computability logic. The present paper extends that result to the significantly more expressive propositional system CL2. What makes CL2 more expressive than CL1 is the presence of CARDINAL sorts of atoms in its language: elementary atoms, representing elementary computational problems (i.e. predicates), and general atoms, representing arbitrary computational problems. CL2 conservatively extends CL1, with the latter being nothing but the general-atom-free fragment of the former.
0
We analyze electroproduction of light vector meson at small GPE $x$ within the generalized parton distribution (ORG) approach. Calculation is based on the modified perturbative approach, where the quark transverse degrees of freedom in the hard subprocess are considered. Our results on the cross section are in fair agreement with experiment from GPE to ORG energies.
The term "PRODUCT kernel" stands for correlation-resemblance kernel. In many applications (e.g., vision), the data are often high-dimensional, sparse, and non-binary. We propose CARDINAL types of (nonlinear) PRODUCT kernels for non-binary sparse data and demonstrate the effectiveness of the new kernels through a classification experiment. PRODUCT kernels are simple with no tuning parameters. However, training nonlinear kernel ORG can be (very) costly in time and memory and may not be suitable for truly large-scale industrial applications (e.g. search). In order to make the proposed PRODUCT kernels more practical, we develop basic probabilistic hashing algorithms which transform nonlinear kernels into ORG kernels.
0
The complementary roles played by parallel quantum computation and quantum measurement in originating the quantum speed-up are illustrated through an analogy with a famous metaphor by ORG.
The topical quantum computation paradigm is a transposition of the ORG machine into the quantum framework. Implementations based on this paradigm have limitations as to the number of: qubits, computation steps, efficient quantum algorithms (found so far). A new exclusively ORG paradigm (with no classical counterpart) is propounded, based on the speculative notion of continuous uncomplete von ORG measurement. Under such a notion, ORG-complete is equal to P. This can provide a mathematical framework for the search of implementable paradigms, possibly exploiting particle statistics.
1
Data processing lower bounds on the expected distortion are derived in the finite-alphabet semi-deterministic setting, where the source produces a deterministic, individual sequence, but the channel model is probabilistic, and the decoder is subjected to various kinds of limitations, e.g., decoders implementable by finite-state machines, with or without counters, and with or without a restriction of common reconstruction with high probability. Some of our bounds are given in terms of the Lempel-Ziv complexity of the source sequence or the reproduction sequence. We also demonstrate how some analogous results can be obtained for classes of ORG encoders and linear decoders in the continuous alphabet case.
This document consists of lecture notes for a graduate course, which focuses on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of ORG, as well as to graduate students in ORG who have basic background in ORG. Strong emphasis is given to the analogy and parallelism between ORG, as well as to the insights, the analysis tools and techniques that can be borrowed from ORG and `imported' to certain problem areas in ORG. This is a research trend that has been very active in DATE, and the hope is that by exposing the student to the meeting points between these CARDINAL disciplines, we will enhance his/her background and perspective to carry out research in the field. A short outline of the course is as follows: Introduction; PERSONORG and its ORG; PERSON in ORG; Systems of Interacting Particles and ORG; ORG (ORG) and ORG; Additional Topics (optional).
1
The problem of calculating multicanonical parameters recursively is discussed. I describe in detail a computational implementation which has worked reasonably well in practice.
According to contemporary views, equilibrium constant is relevant only to true thermodynamic equilibria in isolated systems with CARDINAL chemical reaction. The paper presents a novel formula that ties-up equilibrium constant and chemical system composition at any state, isolated or open as well. Extending the logarithmic logistic map of ORG, this formula maps the system population at isolated equilibrium into the population at any open equilibrium at p,T=const, using equilibrium constant as a measure. Real chemical systems comprise multiple subsystems; given the resources are limited, joint solution to the set of such expressions, each relevant to a specific subsystem, gives equilibrium composition for each of them. This result means a fundamental break through in the open systems thermodynamics and leads to formerly unknown opportunities in the analysis of real chemical objects.
0
A file repository for calculations of cross sections and kinematic distributions using PERSON generators for high-energy collisions is discussed. The repository is used to facilitate effective preservation and archiving of data from theoretical calculations, as well as for comparisons with experimental data. The ORG data library is publicly accessible and includes a number of PERSON event samples with PERSON predictions for current and future experiments. The ORG project includes a software package to automate the process of downloading and viewing online PERSON event samples. A data streaming over a network for end-user analysis is discussed.
Multiplicity correlations between the current and target regions of the GPE frame in deep-inelastic scattering processes are studied. It is shown that the correlations are sensitive to the ORDINAL-order perturbative ORG effects and can be used to extract the behavior of the boson-gluon fusion rates as a function of the GPE variable. The behavior of the correlations is derived analytically and analyzed using a PERSON simulation.
1
We analytically work out the long-term orbital perturbations induced by a homogeneous circular ring of radius PERSON and mass mr on the motion of a test particle in the cases (I): r > R_r and (II): r < R_r. In order to extend the validity of our analysis to the orbital configurations of, e.g., some proposed spacecraftbased mission for fundamental physics like ORG and ORG, of possible GPE around the supermassive black hole in ORG* coming from tidal disruptions of incoming gas clouds, and to the effect of artificial space debris belts around the LOC, we do not restrict ourselves to the case in which the ring and the orbit of the perturbed particle lie just in the same plane. From the corrections to the standard secular perihelion precessions, recently determined by a team of astronomers for some planets of the PRODUCT, we infer upper bounds on mr for various putative and known annular matter distributions of natural origin (close circumsolar ring with R_r = CARDINAL-0.13 au, dust ring with R_r = CARDINAL au, minor asteroids, NORP Objects). We find m_r <= CARDINAL CARDINAL^-4 m_E (circumsolar ring with R_r = CARDINAL au), m_r <= DATE^-6 m_E (circumsolar ring with R_r = CARDINAL au), m_r <= DATE^-7 m_E (ring with R_r = CARDINAL au), m_r <= CARDINAL 10^-12 M_S (asteroidal ring with R_r = CARDINAL au), m_r <= CARDINAL <= CARDINAL^PRODUCT (asteroidal ring with R_r = CARDINAL au), m_r <= CARDINAL^-8 M_S (TNOs ring with R_r = CARDINAL au). In principle, our analysis is valid both for baryonic and non-baryonic PERSON distributions.
There is significant concern that technological advances, especially in LOC and ORG (AI), could lead to high levels of unemployment in DATE. Studies have estimated that CARDINAL of all current jobs are at risk of automation. To look into this issue in more depth, we surveyed experts in ORG and ORG about the risk, and compared their views with those of non-experts. Whilst the experts predicted a significant number of occupations were at risk of automation in DATE, they were more cautious than people outside the field in predicting occupations at risk. Their predictions were consistent with their estimates for when computers might be expected to reach human level performance across a wide range of skills. These estimates were typically DATE than those of the non-experts. Technological barriers may therefore provide society with more time to prepare for an automated future than the public fear. In addition, public expectations may need to be dampened about the speed of progress to be expected in GPE and ORG.
0
In a complete metric space that is equipped with a doubling measure and supports a Poincar\'e inequality, we show that functions of bounded variation (BV functions) can be approximated in the strict sense and pointwise uniformly by special functions of bounded variation, without adding significant jumps. As a main tool, we study the variational CARDINAL-capacity and its ORG analog.
We study a stochastic control system, described by Ito controllable equation, and evaluate the solutions by an entropy functional (EF), defined by the equation functions of controllable drift and diffusion. Considering a control problem for this functional, we solve the ORG control variation problem (VP), which leads to both a dynamic approximation of the process entropy functional by an information path functional (ORG) and information dynamic model (IDM) of the stochastic process. The ORG variation equations allow finding the optimal control functions, applied to both stochastic system and the ORG for joint solution of the identification and optimal control problems, combined with state consolidation. In this optimal dual strategy, the ORG optimum predicts each current control action not only in terms of total functional path goal, but also by setting for each following control action the renovated values of this functional controllable drift and diffusion, identified during the optimal movement, which concurrently correct this goal. The VP information invariants allow optimal encoding of the identified dynamic model operator and control. The introduced method of cutting off the process by applying an impulse control estimates the cutoff information, accumulated by the process inner connections between its states. It has shown that such a functional information measure contains more information than the sum of FAC entropies counted for all process separated states, and provides information measure of ORG kernel. Examples illustrate the procedure of solving these problems, which has been implemented in practice. Key words: Entropy and information path functionals, variation equations, information invariants, controllable dynamics, impulse controls, cutting off the diffusion process, identification, cooperation, encoding.
0
This paper provides an overview of the NORP theory of intelligence and its central idea that artificial intelligence, mainstream computing, and much of human perception and cognition, may be understood as information compression. The background and origins of the NORP theory are described, and the main elements of the theory, including the key concept of multiple alignment, borrowed from bioinformatics but with important differences. Associated with the NORP theory is the idea that redundancy in information may be understood as repetition of patterns, that compression of information may be achieved via the matching and unification (merging) of patterns, and that computing and information compression are both fundamentally probabilistic. It appears that the NORP system is Turing-equivalent in the sense that anything that may be computed with a Turing machine may, in principle, also be computed with an NORP machine. CARDINAL of the main strengths of the NORP theory and the multiple alignment concept is in modelling concepts and phenomena in artificial intelligence. Within that area, the NORP theory provides a simple but versatile means of representing different kinds of knowledge, it can model both the parsing and production of natural language, with potential for the understanding and translation of natural languages, it has strengths in pattern recognition, with potential in computer vision, it can model several kinds of reasoning, and it has capabilities in planning, problem solving, and unsupervised learning. The paper includes CARDINAL examples showing how alternative parsings of an ambiguous sentence may be modelled as multiple alignments, and another example showing how the concept of multiple alignment may be applied in medical diagnosis.
This article introduces the idea that probabilistic reasoning (PR) may be understood as "information compression by multiple alignment, unification and search" (ICMAUS). In this context, multiple alignment has a meaning which is similar to but distinct from its meaning in bio-informatics, while unification means a simple merging of matching patterns, a meaning which is related to but simpler than the meaning of that term in logic. A software model, SP61, has been developed for the discovery and formation of 'good' multiple alignments, evaluated in terms of information compression. The model is described in outline. Using examples from the SP61 model, this article describes in outline how the ICMAUS framework can model various kinds of PR including: PR in best-match pattern recognition and information retrieval; CARDINAL-step 'deductive' and 'abductive' PR; inheritance of attributes in a class hierarchy; chains of reasoning (probabilistic decision networks and decision trees, and PR with 'rules'); geometric analogy problems; nonmonotonic reasoning and reasoning with default values; modelling the function of a NORP network.
1
Information is the basic concept of information theory. However, there is no definition of this concept that can encompass all uses of the term information in information theories and beyond. Many question a possibility of such a definition. However, foundations of information theory developed in the context of the general theory of information made it possible to build such a relevant and at the same time, encompassing definition. Foundations of information theory are built in a form of ontological principles, which reflect basic features of information and information processes.
In this thesis I present a virtual laboratory which implements CARDINAL different models for controlling animats: a rule-based system, a behaviour-based system, a concept-based system, a neural network, and a GPE architecture. Through different experiments, I compare the performance of the models and conclude that there is no "best" model, since different models are better for different things in different contexts. The models I chose, although quite simple, represent different approaches for studying cognition. Using the results as an empirical philosophical aid, I note that there is no "best" approach for studying cognition, since different approaches have all advantages and disadvantages, because they study different aspects of cognition from different contexts. This has implications for current debates on "proper" approaches for cognition: all approaches are a bit proper, but none will be "proper enough". I draw remarks on the notion of cognition abstracting from all the approaches used to study it, and propose a simple classification for different types of cognition.
0
We give an elementary review of black holes in string theory. We discuss BPS holes, the microscopic computation of entropy and the `fuzzball' picture of the black hole interior suggested by microstates of the CARDINAL-charge system.
We study the model of massless MONEY electrodynamics with nonconstant coupling, introduced by ORG as the `charge hole'. But we take the boundary of the strong coupling region to be ORDINAL timelike, then spacelike for a distance $MONEY, and then timelike again (to mimic the structure of a black hole). For an incident charge pulse entering this `charge trap' the charge and information get separated. The charge comes out near the endpoint of the singularity. The `information' travels a well localised path through the strong coupling region and comes out later.
1
There are very significant changes taking place in the university sector and in related higher education institutes in many parts of the world. In this work we look at financial data from DATE and DATE from the GPE higher education sector. Situating ourselves to begin with in the context of teaching versus research in universities, we look at the data in order to explore the new divergence between the broad agendas of teaching and research in universities. The innovation agenda has become at least equal to the research and teaching objectives of universities. From the financial data, published in the ORG Higher Education DATE newspaper, we explore the interesting contrast, and very opposite orientations, in specialization of universities in the GPE. We find a polarity in specialism that goes considerably beyond the usual one of research-led elite versus more teaching-oriented new universities. Instead we point to the role of medical/bioscience research income in the former, and economic and business sectoral niche player roles in the latter.
Discussion of "Treelets--An adaptive multi-Scale basis for sparse unordered data" [arXiv:0707.0481]
1
A resonance search has been made in FAC, K^{0}s-pbar and ORG invariant-mass spectra measured with the ORG detector at ORG using an integrated luminosity of CARDINAL pb^{-1}. The search was performed in the central rapidity region of inclusive deep inelastic scattering at an ep centre-of-mass energy of CARDINAL--318 GeV for exchanged photon virtuality, CARDINAL, above CARDINAL GeV^{2}. The results support the existence of a narrow state in ORG and K^{0}s-pbar decay channels, consistent with the pentaquark prediction. No signal was found in the PERSON decay channel.
Starting from the primary representation of neutrosophic information, namely the degree of truth, degree of indeterminacy and degree of falsity, we define a nuanced representation in a penta valued fuzzy space, described by the index of truth, index of falsity, index of ignorance, index of contradiction and index of hesitation. Also, it was constructed an associated penta valued logic and then using this logic, it was defined for the proposed penta valued structure the following operators: union, intersection, negation, complement and dual. Then, the penta valued representation is extended to a hexa valued one, adding the ORDINAL component, namely the index of ambiguity.
0
The paper considers a linear regression model with multiple change-points occurring at unknown times. The ORG technique is very interesting since it allows the parametric estimation, including the change-points, and automatic variable selection simultaneously. The asymptotic properties of the ORG-type (which has as particular case the ORG estimator) and of the adaptive ORG estimators are studied. For this last estimator the oracle properties are proved. In both cases, a model selection criterion is proposed. Numerical examples are provided showing the performances of the adaptive ORG estimator compared to the ORG estimator.
In this paper we are interested in parameters estimation of ORG model when number of parameters increases with sample size. Without any assumption about moments of the model error, we propose and study the seamless MONEY quantile estimator. For this estimator we ORDINAL give the convergence rate. Afterwards, we prove that it correctly distinguishes CARDINAL and nonzero parameters and that the estimators of the nonzero parameters are asymptotically normal. A consistent ORG criterion to select the tuning parameters is given.
1
This empirical study is mainly devoted to comparing CARDINAL tree-based boosting algorithms: mart, ORG, robust logitboost, and ORG-logitboost, for multi-class classification on a variety of publicly available datasets. Some of those datasets have been thoroughly tested in prior studies using a broad range of classification algorithms including ORG, neural nets, and deep learning. In terms of the empirical classification errors, our experiment results demonstrate: CARDINAL. Abc-mart considerably improves mart. CARDINAL Abc-logitboost considerably improves (robust) logitboost. CARDINAL. Robust) logitboost} considerably improves mart on most datasets. CARDINAL Abc-logitboost considerably improves ORG on most datasets. CARDINAL These CARDINAL boosting algorithms (especially ORG-logitboost) outperform ORG on many datasets. CARDINAL Compared to the best deep learning methods, these CARDINAL boosting algorithms (especially ORG-logitboost) are competitive.
Counting is among the most fundamental operations in computing. For example, counting the pth frequency moment has been a very active area of research, in theoretical computer science, databases, and data mining. When p=1, the task (i.e., counting the sum) can be accomplished using a simple counter. PERSON (ORG) is proposed for efficiently computing the pth frequency moment of a data stream signal A_t, where 0<p<=2. ORG is applicable if the streaming data follow the PERSON model, with the restriction that at the time t for the evaluation, A_t[i]>= 0, which includes the strict PERSON model as a special case. For natural data streams encountered in practice, this restriction is minor. The underly technique for ORG is what we call skewed stable random projections, which captures the intuition that, when p=1 a simple counter suffices, and when p = DATE with small ORG, the sample complexity of a counter system should be low (continuously as a function of \Delta). We show at small \Delta the sample complexity (number of projections) k = O(1/\epsilon) instead of O(1/\epsilon^2). PERSON can serve a basic building block for other tasks in statistics and computing, for example, estimation entropies of data streams, parameter estimations using the method of moments and maximum likelihood. Finally, another contribution is an algorithm for approximating the logarithmic norm, \sum_{i=1}^D\log A_t[i], and logarithmic distance. The logarithmic distance is useful in machine learning practice with heavy-tailed data.
1
Computability logic (CL) (see ORG) is a semantical platform and research program for redeveloping logic as a formal theory of computability, as opposed to the formal theory of truth which it has more traditionally been. Formulas in ORG stand for (interactive) computational problems, understood as games between a machine and its environment; logical operators represent operations on such entities; and "truth" is understood as existence of an effective solution, i.e., of an algorithmic winning strategy. The formalism of ORG is open-ended, and may undergo series of extensions as the study of the subject advances. The main groups of operators on which ORG has been focused so far are the parallel, choice, branching, and blind operators. The present paper introduces a new important group of operators, called sequential. The latter come in the form of sequential conjunction and disjunction, sequential quantifiers, and sequential recurrences. As the name may suggest, the algorithmic intuitions associated with this group are those of sequential computations, as opposed to the intuitions of parallel computations associated with the parallel group of operations: playing a sequential combination of games means playing its components in a sequential fashion, CARDINAL after one. The main technical result of the present paper is a sound and complete axiomatization of the propositional fragment of computability logic whose vocabulary, together with negation, includes all CARDINAL -- parallel, choice and sequential -- sorts of conjunction and disjunction. An extension of this result to the ORDINAL-order level is also outlined.
There are many examples in the literature that suggest that indistinguishability is intransitive, despite the fact that the indistinguishability relation is typically taken to be an equivalence relation (and thus transitive). It is shown that if the uncertainty perception and the question of when an agent reports that CARDINAL things are indistinguishable are both carefully modeled, the problems disappear, and indistinguishability can indeed be taken to be an equivalence relation. Moreover, this model also suggests a logic of vagueness that seems to solve many of the problems related to vagueness discussed in the philosophical literature. In particular, it is shown here how the logic can handle the sorites paradox.
0
The paper explores a possible application of the discrete thermodynamics to a CARDINAL-level laser. The model accounts for the laser openness to incoming pumping power and coming out energy with the emitted light. As an open system, a laser should be in open equilibrium with thermodynamic forces, related to both energy flows. Conditions of equilibria are expressed by a logistic map with specially developed dynamic inverse pitchfork bifurcation diagrams for graphical presentation of the solutions. The graphs explicitly confirm the triggering nature of a laser where bistability is manifested by pitchfork ground and laser branches, with the relative population equilibrium values close to CARDINAL and CARDINAL correspondingly. Simulation was run for a CARDINAL-level laser emitting light from far infrared to short wave UV. A newly discovered feature of such a laser is the line spectrum of up and down transitions of the laser excitable dwellers, occurring between the laser and the ground pitchfork branches beyond bifurcation point. The density of the spectra lines tangibly increases as the branches approach their limits. Transitions of both types are overlapping in opposite phases. This effect is a new confirmation of the PERSON's prohibition on practical realization of a CARDINAL-level laser. Wide enough gaps between the lines of the spectra were also discovered in this research. The gaps are shielding the light irradiation and may be considered as potential areas of control over the CARDINAL-level laser emissions.
PERSON defined an evolutionary unit as hereditary information for which the selection bias between competing units dominates the informational decay caused by imperfect transmission. In this article, I extend PERSON' approach to show that the ratio of selection bias to transmission bias provides a unifying framework for diverse biological problems. Specific examples include GPE and ORG's mutation-selection balance, ORG's error threshold and quasispecies, PERSON clade selection, ORG's multilevel formulation of group selection, Szathmary and PERSON's evolutionary origin of primitive cells, PERSON and PERSON's short-sighted evolution of HIV virulence, PERSON's timescale analysis of microbial metabolism, and PERSON and GPE's major transitions in evolution. The insights from these diverse applications lead to a deeper understanding of kin selection, group selection, multilevel evolutionary analysis, and the philosophical problems of evolutionary units and individuality.
0
We consider the inverse mean curvature flow in ORG spacetimes that satisfy the PERSON equations and have a big crunch singularity and prove that under natural conditions the rescaled inverse mean curvature flow provides a smooth transition from big crunch to big bang. We also construct an example showing that in general the transition flow is only of class $MONEY
We consider optimization problems that are formulated and solved in the framework of tropical mathematics. The problems consist in minimizing or maximizing functionals defined on vectors of finite-dimensional semimodules over idempotent semifields, and may involve constraints in the form of ORG equations and inequalities. The objective function can be either a linear function or nonlinear function calculated by means of multiplicative conjugate transposition of vectors. We start with an overview of known tropical optimization problems and solution methods. Then, we formulate certain new problems and present direct solutions to the problems in a closed compact vector form suitable for further analysis and applications. For many problems, the results obtained are complete solutions.
0
The availability of interaction devices has raised interest in techniques to support the user interface (UI). A ORG specification describes the functions that a system provides to its users by capturing the interface details and includes possible actions through interaction elements. UI developers of interactive systems have to address multiple sources of heterogeneity, including end users heterogeneity and variability of the context of use. This paper contributes to the notion of interactivity and interfacing by proposing a methodology for producing engineering-type diagrams of (abstract) machine processes that can specify uniform structure and behavior of systems through a synchronic order of states (stages): creation, release, transfer, receive, and process. As an example, the diagrammatic methodology is applied to conceptualizing space as a machine. The resulting depiction seems suitable for use in designing UIs in certain environments.
The aim of this paper is to promote the terms thing and thinging (which refers to the act of defining a boundary around some portion of reality and labeling it with a name) as valued notions that play an important role in software engineering modeling. Additionally, we attempt to furnish operational definitions for terms thing, object, process, and thinging. The substantive discussion is based on the conception of an (abstract) machine, named ORGORG), used in several research works. The ORG creates, processes, receives, releases, and transfers things. Accordingly, a diagrammatic representation of the ORG is used to model reality. In the discussion section, this paper clarifies interesting issues related to conceptual modeling in software engineering. The substance of this paper and its conclusion suggest that thinging should be more meaningfully emphasized as a valuable research and teaching topic, at least in the requirement analysis phase of the software development cycle.
1
We propose the possibilities of designing nano-scale rectifiers using mesoscopic rings. A single mesoscopic ring is used for CARDINAL-wave rectification, while full-wave rectification is achieved using CARDINAL such rings and in both cases each ring is threaded by a time varying magnetic flux CARDINAL\phi$ which plays a central role in the rectification action. Within a tight-binding framework, all the calculations are done based on the ORG's function formalism. We present numerical results for the CARDINAL-terminal conductance and current which support the general features of CARDINAL-wave and full-wave rectifications. The analysis may be helpful in fabricating mesoscopic or nano-scale rectifiers.
In the measurement-based ORG computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of byproduct operators. If we respect the no-signaling principle, byproduct operators cannot be avoided. In this paper, we study the possibility of acausal measurement-based ORG computing by using the process matrix framework [PERSON, PERSON, and PERSON, WORK_OF_ART {\bf3}, DATE (DATE)]. We construct a resource process matrix for acausal measurement-based ORG computing. The resource process matrix is an analog of the resource state of the causal measurement-based ORG computing. We find that the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based ORG computing.
0
Maybe active discussions about entanglement in ORG information science demonstrate some immaturity of this rather young area. So recent tries to look for more accurate ways of classification devote rather encouragement than criticism.
In this presentation are discussed some problems, relevant with application of information technologies in nano-scale systems and devices. Some methods already developed in ORG may be very useful here. Here are considered CARDINAL illustrative models: representation of data by ORG bits and transfer of signals in ORG wires.
1
The information-theoretic point of view proposed by ORG in DATE and developed by algorithmic information theory (ORG) suggests that mathematics and physics are not that different. This will be a ORDINAL-person account of some doubts and speculations about the nature of mathematics that I have entertained for DATE, and which have now been incorporated in a digital philosophy paradigm shift that is sweeping across the sciences.
The approach defines information process from probabilistic observation, emerging microprocess,qubit, encoding bits, evolving macroprocess, and extends to Observer information self-organization, cognition, intelligence and understanding communicating information. Studying information originating in quantum process focuses not on particle physics but on natural interactive impulse modeling Bit composing information observer. Information emerges from NORP probabilities field when sequences of CARDINAL-0 probabilities link PERSON probabilities modeling arising observer. These objective yes-no probabilities virtually cuts observing entropy hidden in cutting correlation decreasing PERSON process entropy and increasing entropy of cutting impulse running minimax principle. Merging impulse curves and rotates yes-no conjugated entropies in microprocess. The entropies entangle within impulse time interval ending with beginning space. The opposite curvature lowers potential energy converting entropy to memorized bit. The memorized information binds reversible microprocess with irreversible information macroprocess. Multiple interacting Bits self-organize information process encoding causality, logic and complexity. Trajectory of observation process carries probabilistic and certain wave function self-building structural macrounits. Macrounits logically self-organize information networks encoding in triplet code. Multiple IN enclose observer information cognition and intelligence. Observer cognition assembles attracting common units in resonances forming IN hierarchy accepting only units recognizing IN node. Maximal number of accepted triplets measures the observer information intelligence. Intelligent observer recognizes and encodes digital images in message transmission enables understanding the message meaning. Cognitive logic self-controls encoding the intelligence in double helix code.
0
A large body of research in machine learning is concerned with supervised learning from examples. The examples are typically represented as vectors in a multi-dimensional feature space (also known as attribute-value descriptions). A teacher partitions a set of training examples into a finite number of classes. The task of the learning algorithm is to induce a concept from the training examples. In this paper, we formally distinguish CARDINAL types of features: primary, contextual, and irrelevant features. We also formally define what it means for one feature to be context-sensitive to another feature. Context-sensitive features complicate the task of the learner and potentially impair the learner's performance. Our formal definitions make it possible for a learner to automatically identify context-sensitive features. After context-sensitive features have been identified, there are several strategies that the learner can employ for managing the features; however, a discussion of these strategies is outside of the scope of this paper. The formal definitions presented here correct a flaw in previously proposed definitions. We discuss the relationship between our work and a formal definition of relevance.
We show that combining CARDINAL different hypothetical enhancements to quantum computation---namely, quantum advice and non-collapsing measurements---would let a ORG computer solve any decision problem whatsoever in polynomial time, even though neither enhancement yields extravagant power by itself. This complements a related result due to Raz. The proof uses locally decodable codes.
0
The ORG problem for the PERSON system is shown to be locally well-posed for low regularity Schr\"odinger data u_0 ORG,p}} and wave data (ORG,p}} \times \hat{H^{l-1,p}} under certain assumptions on the parameters k,l and 1<p\le CARDINAL, where ORG,p}}} := \| < \xi >^k \hat{u_0}\|_{L^{p'}}, generalizing the results for p=2 by PERSON, PERSON, and PERSON. Especially we are able to improve the results from the scaling point of view, and also allow suitable k<0, l<-1/2, i.e. data u_0 \not\in L^2 and (n_0,n_1)\not\in H^{-1/2}\times H^{-3/2}, which was excluded in the case p=2.
We consider the ORG system in GPE gauge and use a null condition to show local well-psoedness for low regularity data. This improves a recent result of ORG.
1
ORG) seem to have displaced traditional 'smooth' nonlinearities as activation-function-du-jour in many - but not all - deep neural network (DNN) applications. However, nobody seems to know why. In this article, we argue that PRODUCT are useful because they are ideal demodulators - this helps them perform fast abstract learning. However, this fast learning comes at the expense of serious nonlinear distortion products - decoy features. We show that ORG acts to suppress the decoy features, preventing overfitting and leaving the true features cleanly demodulated for rapid, reliable learning.
Convolutional deep neural networks (DNN) are state of the art in many engineering problems but have not yet addressed the issue of how to deal with complex spectrograms. Here, we use circular statistics to provide a convenient probabilistic estimate of spectrogram phase in a complex convolutional DNN. In a typical cocktail party source separation scenario, we trained a convolutional DNN to re-synthesize the complex spectrograms of CARDINAL source speech signals given a complex spectrogram of the monaural mixture - a discriminative deep transform (ORG). We then used this complex convolutional ORG to obtain probabilistic estimates of the magnitude and phase components of the source spectrograms. Our separation results are on a par with equivalent binary-mask based non-complex separation approaches.
1
We derive, for a bistochastic strictly contractive ORG channel on a matrix algebra, a relation between the contraction rate and the rate of entropy production. We also sketch some applications of our result to the statistical physics of irreversible processes and to quantum information processing.
This paper presents a new version of a branching batch classifier that has added fixed value ranges through bands, for each column or feature of the input dataset. Each layer branches like a tree, but has a different architecture to the current classifiers. Each branch is not for a feature, but for a change in output category. Therefore, each classifier classifies its own subset of data rows and categories, using averaged values only and with decreasing numbers of data row in each new level. When considering features however, it is shown that some of the data can be correctly classified through using fixed value ranges, while the rest can be classified by using the classifier technique. Tests show that the method can successfully classify benchmark datasets to better than the state-of-the-art. Fixed value ranges are like links and so the paper discusses the biological analogy with neurons and neuron links.
0
The black hole information paradox tells us something important about the way quantum mechanics and gravity fit together. In these lectures I try to give a pedagogical review of the essential physics leading to the paradox, using mostly pictures. Hawking's argument is recast as a `theorem': if quantum gravity effects are confined to within a given length scale and the vacuum is assumed to be unique, then there will be information loss. We conclude with a brief summary of how quantum effects in string theory violate the ORDINAL condition and make the interior of the hole a `fuzzball'.
We reminisce and discuss applications of algorithmic probability to a wide range of problems in artificial intelligence, philosophy and technological society. We propose that PERSON has effectively axiomatized the field of artificial intelligence, therefore establishing it as a rigorous scientific discipline. We also relate to our own work in incremental machine learning and philosophy of complexity.
0
Combinatorial evolution and forecasting of system requirements is examined. The morphological model is used for a hierarchical requirements system (i.e., system parts, design alternatives for the system parts, ordinal estimates for the alternatives). A set of system changes involves changes of the system structure, component alternatives and their estimates. The composition process of the forecast is based on combinatorial synthesis (knapsack problem, multiple choice problem, hierarchical morphological design). An illustrative numerical example for CARDINAL-phase evolution and forecasting of requirements to communications is described.
In this note is touched upon an application of quantum information science (QIS) in nanotechnology area. The laws of quantum mechanics may be very important for nano-scale objects. A problem with simulating of ORG systems is well known and ORG computer was initially suggested by PERSON just as the way to overcome such difficulties. Mathematical methods developed in QIS also may be applied for description of nano-devices. Few illustrative examples are mentioned and they may be related with so-called ORDINAL generation of nanotechnology products.
0
The article proposes a heuristic approximation approach to the bin packing problem under multiple objectives. In addition to the traditional objective of minimizing the number of bins, the heterogeneousness of the elements in each PERSON is minimized, leading to a biobjective formulation of the problem with a tradeoff between the number of bins and their heterogeneousness. An extension of the Best-Fit approximation algorithm is presented to solve the problem. Experimental investigations have been carried out on benchmark instances of different size, ranging from CARDINAL items. Encouraging results have been obtained, showing the applicability of the heuristic approach to the described problem.
The article presents a local search approach for the solution of timetabling problems in general, with a particular implementation for competition track CARDINAL of ORG DATE (ORG 2007). The heuristic search procedure is based on PERSON to overcome local optima. A stochastic neighborhood is proposed and implemented, randomly removing and reassigning events from the current solution. The overall concept has been incrementally obtained from a series of experiments, which we describe in each (sub)section of the paper. In result, we successfully derived a potential candidate solution approach for the finals of track CARDINAL of the ORG DATE.
1
This paper describes a new entropy-style of equation that may be useful in a general sense, but can be applied to a cognitive model with related processes. The model is based on the human brain, with automatic and distributed pattern activity. Methods for carrying out the different processes are suggested. The main purpose of this paper is to reaffirm earlier research on different knowledge-based and experience-based clustering techniques. The overall architecture has stayed essentially the same and so it is the localised processes or smaller details that have been updated. For example, a counting mechanism is used slightly differently, to measure a level of 'cohesion' instead of a 'correct' classification, over pattern instances. The introduction of features has further enhanced the architecture and the new entropy-style equation is proposed. While an earlier paper defined CARDINAL levels of functional requirement, this paper re-defines the levels in a more human vernacular, with higher-level goals described in terms of action-result pairs.
This paper continues the research that considers a new cognitive model based strongly on the human brain. In particular, it considers the neural binding structure of an earlier paper. It also describes some new methods in the areas of image processing and behaviour simulation. The work is all based on earlier research by the author and the new additions are intended to fit in with the overall design. For image processing, a grid-like structure is used with 'full linking'. Each cell in the classifier grid stores a list of all other cells it gets associated with and this is used as the learned image that new input is compared to. For the behaviour metric, a new prediction equation is suggested, as part of a simulation, that uses feedback and history to dynamically determine its course of action. While the new methods are from widely different topics, both can be compared with the binary-analog type of interface that is the main focus of the paper. It is suggested that the simplest of linking between a tree and ensemble can explain neural binding and variable signal strengths.
1
The aim of this note is to attract once again attention of the quantum community to statistical analysis of data which was reported as violating ORG's inequality. This analysis suffers of a number of problems. And the main problem is that rough data is practically unavailable. However, experiments which are not followed by the open access to the rough data have to be considered as with no result. The absence of rough data generates a variety of problems in statistical interpretation of the results of ORG's type experiment. CARDINAL may hope that this note would stimulate experimenters to create the open access data-base for, e.g., ORG tests. Unfortunately, recently announced experimental loophole-free violation of a ORG inequality using entangled ORG spins separated by QUANTITY was not supported by open-access data. Therefore in accordance with our approach "it has no result." The promising data after publication is, of course, a step towards fair analysis quantum experiments. May be this is a consequence of appearance of this preprint, v1. But there are a few questions which would be interesting to clarify before publication (and which we shall discuss in this note).
We discuss foundation of ORG (interpretations, superposition, principle of complementarity, locality, hidden variables) and quantum information theory.
1
Transcript of PERSON DATE ORG of Computer Science Distinguished Lecture. The notion of randomness is taken from physics and applied to pure mathematics in order to shed light on the incompleteness phenomenon discovered by PERSON.
This article discusses what can be proved about the foundations of mathematics using the notions of algorithm and information. The ORDINAL part is retrospective, and presents a beautiful antique, PERSON's proof, the ORDINAL modern incompleteness theorem, PERSON's halting problem, and a piece of postmodern metamathematics, the halting probability PERSON. The ORDINAL part looks forward to DATE and discusses the convergence of theoretical physics and theoretical computer science and hopes for a theoretical biology, in which the notions of algorithm and information are again crucial.
1
Multi-agent approach has become popular in computer science and technology. However, the conventional models of multi-agent and multicomponent systems implicitly or explicitly assume existence of absolute time or even do not include time in the set of defining parameters. At the same time, it is proved theoretically and validated experimentally that there are different times and time scales in a variety of real systems - physical, chemical, biological, social, informational, etc. Thus, the goal of this work is construction of a multi-agent multicomponent system models with concurrency of processes and diversity of actions. To achieve this goal, a mathematical system actor model is elaborated and its properties are studied.
ORG means statistical analysis of population or sample that has indeterminate (imprecise, ambiguous, vague, incomplete, unknown) data. For example, the population or sample size might not be exactly determinate because of some individuals that partially belong to the population or sample, and partially they do not belong, or individuals whose appurtenance is completely unknown. Also, there are population or sample individuals whose data could be indeterminate. In this book, we develop the DATE notion of neutrosophic statistics. We present various practical examples. It is possible to define the neutrosophic statistics in many ways, because there are various types of indeterminacies, depending on the problem to solve.
0
In this article, we perform a systematic study of the mass spectrum of the axial-vector hidden charmed and hidden bottom tetraquark states using the ORG sum rules, and identify the $Z^+(4430)$ as an axial-vector tetraquark state tentatively.
In this paper, we consider supervised learning problems such as logistic regression and study the stochastic gradient method with averaging, in the usual stochastic approximation setting where observations are used only once. We show that after $MONEY iterations, with a constant step-size proportional to MONEY \sqrt{N}$ where $MONEY is the number of observations and $MONEY is the maximum norm of the observations, the convergence rate is always of order $PERSON, and improves to $O(R^2 / \mu N)$ where $\mu$ is the lowest eigenvalue of the Hessian at the global optimum (when this eigenvalue is MONEYR^2/\sqrt{N}$). Since $\mu$ does not need to be known in advance, this shows that averaged stochastic gradient is adaptive to \emph{unknown local} strong convexity of the objective function. Our proof relies on the generalized self-concordance properties of the logistic loss and thus extends to all generalized ORG models with uniformly bounded features.
0
The article sets forth comprehensive basics of thermodynamics of chemical equilibrium as balance of the thermodynamic forces. Based on the linear equations of irreversible thermodynamics, ORG definition of the thermodynamic force, and FAC principle, new thermodynamics of chemical equilibrium offers an explicit account for multiple chemical interactions within the system. Basic relations between energetic characteristics of chemical transformations and reaction extents are based on the idea of chemical equilibrium as balance between internal and external thermodynamic forces, which is presented in the form of a logistic equation, containing CARDINAL new parameter. Solutions to the basic equation define the domain of states of the chemical system, from true equilibrium to true chaos. The new theory is derived exclusively from the currently recognized ideas and covers equilibrium thermodynamics as well as non-equilibrium thermodynamics in a unique concept.
The paper presents new thermodynamic paradigm of chemical equilibrium, setting forth comprehensive basics of ORG (DTd). Along with previous results by the author during DATE, this work contains also some new developments of DTd. Based on the ORG's constitutive equations, reformulated by the author thermodynamic affinity and reaction extent, and FAC principle, DTd brings forward a notion of chemical equilibrium as a balance of internal and external thermodynamic forces (TdF), acting against a chemical system. Basic expression of DTd is the chemical system logistic map of thermodynamic states that ties together energetic characteristics of chemical reaction, occurring in the system, the system shift from "true" thermodynamic equilibrium (ORG), and causing that shift external thermodynamic forces. Solutions to the basic map are pitchfork bifurcation diagrams in coordinates "shift from ORG - growth factor (or TdF)"; points, corresponding to the system thermodynamic states, are dwelling on its branches. The diagrams feature CARDINAL typical areas: true thermodynamic equilibrium and open equilibrium along the thermodynamic branch before the threshold of its stability, i.e. bifurcation point, and bifurcation area with bistability and chaotic oscillations after the point. The set of solutions makes up the chemical system domain of states. The new paradigm complies with the correspondence principle: in isolated chemical system external TdF vanish, and the basic map turns into traditional expression of chemical equilibrium via thermodynamic affinity. The theory binds together classical and contemporary thermodynamics of chemical equilibria on a unique conceptual basis. The paper is essentially reworked and refocused version of the earlier preprint on the DTd basics, supplemented with new results.
1
We consider the problem of high-dimensional non-linear variable selection for supervised learning. Our approach is based on performing linear selection among exponentially many appropriately defined positive definite kernels that characterize non-linear interactions between the original variables. To select efficiently from these many kernels, we use the natural hierarchical structure of the problem to extend the multiple kernel learning framework to kernels that can be embedded in a directed acyclic graph; we show that it is then possible to perform kernel selection through a graph-adapted sparsity-inducing norm, in polynomial time in the number of selected kernels. Moreover, we study the consistency of variable selection in high-dimensional settings, showing that under certain assumptions, our regularization framework allows a number of irrelevant variables which is exponential in the number of observations. Our simulations on synthetic datasets and datasets from the ORG repository show state-of-the-art predictive performance for non-linear regression problems.
Hawking's black hole information puzzle highlights the incompatibility between our present understanding of gravity and quantum physics. However, Hawking's prediction of black-hole evaporation is at a semiclassical level. CARDINAL therefore suspects some modifications of the character of the radiation when quantum properties of the {\it black hole itself} are properly taken into account. In fact, during DATE evidence has been mounting that, in a quantum theory of gravity black holes may have a discrete mass spectrum, with concomitant {ORG discrete} line emission. A direct consequence of this intriguing prediction is that, compared with blackbody radiation, black-hole radiance is {\it less} entropic, and may therefore carry a significant amount of {ORG information}. Using standard ideas from quantum information theory, we calculate the rate at which information can be recovered from the black-hole spectral lines. We conclude that the information that was suspected to be lost may gradually leak back, encoded into the black-hole spectral lines.
0
Game theoretic equilibria are mathematical expressions of rationality. Rational agents are used to model not only humans and their software representatives, but also organisms, populations, species and genes, interacting with each other and with the environment. Rational behaviors are achieved not only through conscious reasoning, but also through spontaneous stabilization at equilibrium points. Formal theories of rationality are usually guided by informal intuitions, which are acquired by observing some concrete economic, biological, or network processes. Treating such processes as instances of computation, we reconstruct and refine some basic notions of equilibrium and rationality from the some basic structures of computation. It is, of course, well known that equilibria arise as fixed points; the point is that semantics of computation of fixed points seems to be providing novel methods, algebraic and GPE, for reasoning about them.
The diverse views of science of security have opened up several alleys towards applying the methods of science to security. We pursue a different kind of connection between science and security. This paper explores the idea that security is not just a suitable subject for science, but that the process of security is also similar to the process of science. This similarity arises from the fact that both science and security depend on the methods of inductive inference. Because of this dependency, a scientific theory can never be definitely proved, but can only be disproved by new evidence, and improved into a better theory. Because of the same dependency, every security claim and method has a lifetime, and always eventually needs to be improved. In this general framework of security-as-science, we explore the ways to apply the methods of scientific induction in the process of trust. The process of trust building and updating is viewed as hypothesis testing. We propose to formulate the trust hypotheses by the methods of algorithmic learning, and to build more robust trust testing and vetting methodologies on the solid foundations of statistical inference.
1
We study light vector meson electroproduction at small $x$ within the generalized parton distributions (GPDs) model. The modified perturbative approach is used, where the quark transverse degrees of freedom in the vector meson wave function and hard subprocess are considered. Our results on ORG section and spin observables are in good agreement with experiment
On the basis of the handbag approach we study cross sections and spin asymmetries for leptoproduction of various vector and pseudoscalar mesons. Our results are in good agrement with high energy experiments. We analyse what information about ORG (GPDs) can be obtained from these reactions.
1
The commonly used circuit model of ORG computing leaves out the problems of imprecision in the initial state preparation, particle statistics (indistinguishability of particles belonging to the same quantum state), and error correction (current techniques cannot correct all small errors). The initial state in the circuit model computation is obtained by applying potentially imprecise ORG gate operations whereas useful quantum computation requires a state with no uncertainty. We review some limitations of the circuit model and speculate on the question if a hierarchy of quantum-type computing models exists.
GPE computing is the use of multiple autonomic and parallel modules together with integrative processors at a higher level of abstraction to embody "intelligent" processing. The biological basis of this computing is sketched and the matter of learning is examined.
1
We show in this article that if a holomorphic vector bundle has a nonnegative NORP metric in the sense of PERSON and ORG, which always exists on globally generated holomorphic vector bundles, then some special linear combinations of ORG forms are strongly nonnegative. This particularly implies that all the ORG numbers of such a holomorphic vector bundle are nonnegative and can be bounded below and above respectively by CARDINAL special ORG numbers. As applications, we obtain a family of new results on compact connected complex manifolds which are homogeneous or can be holomorphically immersed into complex tori, some of which improve several classical results.
Separation of competing speech is a key challenge in signal processing and a feat routinely performed by the human auditory brain. A long standing benchmark of the spectrogram approach to source separation is known as the ideal binary mask. Here, we train a convolutional deep neural network, on a CARDINAL-speaker cocktail party problem, to make probabilistic predictions about binary masks. Our results approach ideal binary mask performance, illustrating that relatively simple deep neural networks are capable of robust binary mask prediction. We also illustrate the trade-off between prediction statistics and separation quality.
0
We analyse the diffractive $Q \bar Q$ production and final jet kinematics in polarized deep-inelastic lp scattering at $\sqrt{s}=20 GeV$. We show that this reaction can be used in the new spectrometer of the COMPASS Collaboration at GPE to study the quark-pomeron coupling structure.
Connections between the sequentiality/concurrency distinction and the semantics of proofs are investigated, with particular reference to games and ORG.
0
We present new findings in regard to data analysis in very high dimensional spaces. We use dimensionalities up to CARDINAL. A particular benefit of ORG is its suitability for carrying out an orthonormal mapping, or scaling, of power law distributed data. Power law distributed data are found in many domains. Correspondence factor analysis provides a latent semantic or principal axes mapping. Our experiments use data from digital chemistry and finance, and other statistically generated data.
Errors in data are usually unwelcome and so some means to correct them is useful. However, it is difficult to define, detect or correct errors in an unsupervised way. Here, we train a deep neural network to re-synthesize its inputs at its output layer for a given class of data. We then exploit the fact that this abstract transformation, which we call a deep transform (ORG), inherently rejects information (errors) existing outside of the abstract feature space. Using the ORG to perform probabilistic re-synthesis, we demonstrate the recovery of data that has been subject to extreme degradation.
0
We derive an exact and efficient NORP regression algorithm for piecewise constant functions of unknown segment number, boundary location, and levels. It works for any noise and segment level prior, ORG which can handle outliers. We derive simple but good estimates for the in-segment variance. We also propose a NORP regression curve as a better way of smoothing data without blurring boundaries. The NORP approach also allows straightforward determination of the evidence, break probabilities and error estimates, useful for model selection and significance and robustness studies. We discuss the performance on synthetic and real-world examples. Many possible extensions will be discussed.
PERSON's uncomputable universal prediction scheme $\xi$ allows to predict the next symbol $x_k$ of a sequence $x_1...x_{k-1}$ for any Turing computable, but otherwise unknown, probabilistic environment $\mu$. This scheme will be generalized to arbitrary environmental classes, which, among others, allows the construction of computable universal prediction schemes $\xi$. Convergence of $\xi$ to $\mu$ in a conditional mean squared sense and with $\mu$ probability CARDINAL is proven. It is shown that the average number of prediction errors made by the universal $\xi$ scheme rapidly converges to those made by the best possible informed $\mu$ scheme. The schemes, theorems and proofs are given for general finite alphabet, which results in additional complications as compared to the binary case. Several extensions of the presented theory and results are outlined. They include general loss functions and bounds, games of chance, infinite alphabet, partial and delayed prediction, classification, and more active systems.
1
Computability logic is a formal theory of computational tasks and resources. PERSON in it represent interactive computational problems, and "truth" is understood as algorithmic solvability. Interactive computational problems, in turn, are defined as a certain sort games between a machine and its environment, with logical operators standing for operations on such games. Within the ambitious program of finding axiomatizations for incrementally rich fragments of this semantically introduced logic, the earlier article "From truth to computability I" proved soundness and completeness for system PERSON, whose language has the so called parallel connectives (including negation), choice connectives, choice quantifiers, and blind quantifiers. The present paper extends that result to the significantly more expressive system CL4 with the same collection of logical operators. What makes CL4 expressive is the presence of CARDINAL sorts of atoms in its language: elementary atoms, representing elementary computational problems (i.e. predicates, i.e. problems of CARDINAL degree of interactivity), and general atoms, representing arbitrary computational problems. CL4 conservatively extends PERSON, with the latter being nothing but the general-atom-free fragment of the former. Removing the blind (classical) group of quantifiers from the language of CL4 is shown to yield a decidable logic despite the fact that the latter is still ORDINAL-order. A comprehensive online source on computability logic can be found at ORG
We propose a new class of ORG computing algorithms which generalize many standard ones. The goal of our algorithms is to estimate probability distributions. Such estimates are useful in, for example, applications of WORK_OF_ART, where inferences are made based on uncertain knowledge. The class of algorithms that we propose is based on a construction method that generalizes a Fredkin-Toffoli (F-T) construction method used in the field of classical reversible computing. F-T showed how, given any binary deterministic circuit, one can construct another binary deterministic circuit which does the same calculations in a reversible manner. We show how, given any classical stochastic network (classical NORP net), one can construct a quantum network (quantum NORP net). By running this quantum NORP net on a ORG computer, one can calculate any conditional probability that one would be interested in calculating for the original classical NORP net. Thus, we generalize PRODUCT construction method so that it can be applied to any classical stochastic circuit, not just binary deterministic ones. We also show that, in certain situations, our class of algorithms can be combined with PERSON's algorithm to great advantage.
0
This paper is devoted to expressiveness of hypergraphs for which uncertainty propagation by local computations via Shenoy/Shafer method applies. It is demonstrated that for this propagation method for a given joint belief distribution no valuation of hyperedges of a hypergraph may provide with simpler hypergraph structure than valuation of hyperedges by conditional distributions. This has vital implication that methods recovering belief networks from data have no better alternative for finding the simplest hypergraph structure for belief propagation. A method for recovery tree-structured belief networks has been developed and specialized for PERSON belief functions
Several approaches of structuring (factorization, decomposition) of PERSON joint belief functions from literature are reviewed with special emphasis on their capability to capture independence from the point of view of the claim that belief functions generalize bayes notion of probability. It is demonstrated that PERSON and PERSON's {Zhu:93} logical networks and NORP' {Smets:93} directed acyclic graphs are unable to capture statistical dependence/independence of NORP networks {Pearl:88}. On the other hand, though Shenoy and GPE's hypergraphs can explicitly represent bayesian network factorization of NORP belief functions, they disclaim any need for representation of independence of variables in belief functions. Cano et al. {Cano:93} reject the hypergraph representation of Shenoy and GPE just on grounds of missing representation of variable independence, but in their frameworks some belief functions factorizable in ORG framework cannot be factored. The approach in {Klopotek:93f} on the other hand combines the merits of both Cano et al. and of ORG approach in that for Shenoy/Shafer approach no simpler factorization than that in {GPE} approach exists and on the other hand all independences among variables captured in GPE et al. framework and many more are captured in {Klopotek:93f} approach.%
1
The speed and transformative power of human cultural evolution is evident from the change it has wrought on our planet. This chapter proposes a human computation program aimed at (CARDINAL) distinguishing algorithmic from non-algorithmic components of cultural evolution, (CARDINAL) computationally modeling the algorithmic components, and amassing human solutions to the non-algorithmic (generally, creative) components, and (CARDINAL) combining them to develop human-machine hybrids with previously unforeseen computational power that can be used to solve real problems. Drawing on recent insights into the origins of evolutionary processes from biology and complexity theory, human minds are modeled as self-organizing, interacting, autopoietic networks that evolve through a GPE (NORP) process of communal exchange. Existing computational models as well as directions for future research are discussed.
General-purpose, intelligent, learning agents cycle through sequences of observations, actions, and rewards that are complex, uncertain, unknown, and NORP. On the other hand, reinforcement learning is well-developed for small finite state PERSON decision processes (MDPs). Up to now, extracting the right state representations out of bare observations, that is, reducing the general agent setup to the ORG framework, is an art that involves significant effort by designers. The primary goal of this work is to automate the reduction process and thereby significantly expand the scope of many existing reinforcement learning algorithms and the agents that employ them. Before we can think of mechanizing this search for suitable MDPs, we need a formal objective criterion. The main contribution of this article is to develop such a criterion. I also integrate the various parts into CARDINAL learning algorithm. Extensions to more realistic dynamic NORP networks are developed in Part II. The role of POMDPs is also considered there.
0
In a previous paper, we showed how entanglement of formation can be defined as a minimum of the quantum conditional mutual information (a.k.a. ORG). In classical information theory, the NORP-Blahut method is one of the preferred methods for calculating extrema of mutual information. In this paper, we present a new method, akin to the NORP-Blahut method, for calculating entanglement of formation. We also present several examples computed with a computer program called PERSON that implements the ideas of this paper.
ORG (QMR) is a compendium of statistical knowledge connecting diseases to findings (symptoms). The information in ORG can be represented as a NORP network. The inference problem (or, in more medical language, giving a diagnosis) for the ORG is to, given some findings, find the probability of each disease. Rejection sampling and likelihood weighted sampling (a.k.a. likelihood weighting) are CARDINAL simple algorithms for making approximate inferences from an arbitrary NORP net (and from the QMR NORP net in particular). Heretofore, the samples for these CARDINAL algorithms have been obtained with a conventional "classical computer". In this paper, we will show that CARDINAL analogous algorithms exist for the QMR NORP net, where the samples are obtained with a ORG computer. We expect that these CARDINAL algorithms, implemented on a quantum computer, can also be used to make inferences (and predictions) with other NORP nets.
1
ORG computers use continuous properties of physical system for modeling. In the paper is described possibility of modeling by analogue ORG computers for some model of data analysis. It is analogue associative memory and a formal neural network. A particularity of the models is combination of continuous internal processes with discrete set of output states. The modeling of the system by classical analogue computers was offered long times ago, but now it is not very effectively in comparison with modern digital computers. The application of ORG analogue modelling looks quite possible for modern level of technology and it may be more effective than digital one, because number of element may be about PERSON number (N=6.0E23).
This paper presents a CARDINAL-valued representation of bifuzzy sets. This representation is related to a CARDINAL-valued logic that uses the following values: true, false, inconsistent, incomplete and ambiguous. In the framework of CARDINAL-valued representation, formulae for similarity, entropy and syntropy of bifuzzy sets are constructed.
0
The place of an anthropic argument in the discrimination between various cosmological models is to be reconsidered following the classic criticisms of PERSON and PERSON. Different versions of the anthropic argument against cosmologies involving an infinite series of past events are analyzed and applied to several instructive instances. This is not only of historical significance but presents an important topic for the future of cosmological research if some of the contemporary inflationary models, particularly ORG's chaotic inflation, turn out to be correct. Cognitive importance of the anthropic principle(s) to the issue of extraterrestrial intelligent observers is reconsidered in this light and several related problems facing cosmologies with past temporal infinities are also clearly defined. This issue is not only a clear example of the epistemological significance of the anthropic principle, but also has consequences for such diverse topics as ORG studies, epistemological status of cosmological concepts, theory of observation selection effects, and history of astronomy.
The intriguing suggestion of ORG (DATE) that the universe--contrary to all our experiences and expectations--contains only a small amount of information due to an extremely high degree of internal symmetry is critically examined. It is shown that there are several physical processes, notably Hawking evaporation of black holes and NORP decoherence time effects described by PERSON, as well as thought experiments of GPE and GPE himself, which can be construed as arguments against the low-information universe hypothesis. In addition, an extreme form of physical reductionism is entailed by this hypothesis, and therefore any possible argumentation against such reductionism would count against it either. Some ramifications for both quantum mechanics and cosmology are briefly discussed.
1
An overview of recent ORG results on inclusive production of D* mesons in deep inelastic scattering is given.
There exists a large number of experimental and theoretical results supporting the picture of "macroscopic qubits" implemented, for instance, by ORG atoms, PERSON junctions or ORG condensates - the systems which should rather emerge in localized semiclassical states. In this note it is shown how, under realistic conditions, the false qubit interpretation can be consistent with the restricted set of experimental data collected for semiclassical systems. The recent experiments displaying semiclassical character of ORG condensates and possible quantumness tests for a single system are briefly invoked also.
0
Hidden variables are well known sources of disturbance when recovering belief networks from data based only on measurable variables. Hence models assuming existence of hidden variables are under development. This paper presents a new algorithm "accelerating" the known ORG algorithm of Spirtes, Glymour and ORG {Spirtes:93}. We prove that this algorithm does not produces (conditional) independencies not present in the data if statistical independence test is reliable. This result is to be considered as non-trivial since e.g. the same claim fails to be true for ORG algorithm, another "accelerator" of ORG, developed in {Spirtes:93}.
It is proven, by example, that the version of $k$-means with random initialization does not have the property \emph{probabilistic $k$-richness}.
1
This paper is a survey discussing ORG concepts, methods, and applications. It goes deep into the document and query modelling involved in ORG systems, in addition to pre-processing operations such as removing stop words and searching by synonym techniques. The paper also tackles text categorization along with its application in neural networks and machine learning. Finally, the architecture of web crawlers is to be discussed shedding the light on how internet spiders index web documents and how they allow users to search for items on the web.
CARDINAL of the main purposes of a computer is automation. In fact, automation is the technology by which a manual task is performed with minimum or CARDINAL human assistance. Over DATE, automation has proved to reduce operation cost and maintenance time in addition to increase system productivity, reliability, and performance. DATE, most computerized automation are done by a computer program which is a set of instructions executed from within the computer memory by the computer central processing unit to control the computers various operations. This paper proposes a compiler program that automates the validation and translation of input documents written in the LANGUAGE language into ORG output files that can be read by a computer. The input document is by nature unstructured and in plain-text as it is written by people manually; while, the generated output is a structured machine-readable XML file. The proposed compiler program is actually a part of a bigger project related to digital government and is meant to automate the processing and archiving of juridical data and documents. In essence, the proposed compiler program is composed of a scanner, a parser, and a code generator. Experiments showed that such automation practices could prove to be a starting point for a future digital government platform for the NORP government. As further research, other types of juridical documents are to be investigated, mainly those that require error detection and correction.
1
This paper looks at Turing's postulations about ORG in his paper 'Computing Machinery and ORG', published in DATE. It notes how accurate they were and how relevant they still are DATE. This paper notes the arguments and mechanisms that he suggested and tries to expand on them further. The paper however is mostly about describing the essential ingredients for building an intelligent model and the problems related with that. The discussion includes recent work by the author himself, who adds his own thoughts on the matter that come from a purely technical investigation into the problem. These are personal and quite speculative, but provide an interesting insight into the mechanisms that might be used for building an intelligent system.
This paper describes some biologically-inspired processes that could be used to build the sort of networks that we associate with the human brain. New to this paper, a 'refined' neuron will be proposed. This is a group of neurons that by joining together can produce a more analogue system, but with the same level of control and reliability that a binary neuron would have. With this new structure, it will be possible to think of an essentially binary system in terms of a more variable set of values. The paper also shows how recent research associated with the new model, can be combined with established theories, to produce a more complete picture. The propositions are largely in line with conventional thinking, but possibly with CARDINAL or CARDINAL more radical suggestions. An earlier cognitive model can be filled in with more specific details, based on the new research results, where the components appear to fit together almost seamlessly. The intention of the research has been to describe plausible 'mechanical' processes that can produce the appropriate brain structures and mechanisms, but that could be used without the magical 'intelligence' part that is still not fully understood. There are also some important updates from an earlier version of this paper.
1